%run DEVDAN_sea.ipynb
Number of input: 3
Number of output: 2
Number of batch: 100

All Data
100% (100 of 100) |######################| Elapsed Time: 0:05:20 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.92525252525252 (+/-) 6.080708747804398
Precision: 0.9192776971749151
Recall: 0.9192525252525252
F1 score: 0.9186067780776412
Testing Time: 0.0016018477353182707 (+/-) 0.0005982922835932849
Training Time: 3.236801999987978 (+/-) 0.33800646742671797
=== Average network evolution ===
Total hidden node: 24.19 (+/-) 10.065480614456519
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=41, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 41
No. of parameters : 167
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=41, out_features=2, bias=True)
)
No. of inputs : 41
No. of output : 2
No. of parameters : 84
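The "No. of parameters" figures above are consistent with DEVDAN's denoising-autoencoder view of the hidden layer: the counts match encoder weights plus an encoder bias plus a decoder bias of input size (e.g. 3 × 41 + 41 + 3 = 167), while the output layer is a plain linear layer (41 × 2 + 2 = 84). A minimal sketch of that bookkeeping, assuming this tied-weight interpretation (the helper names are illustrative, not from the DEVDAN code):

```python
# Sketch: reproduce the "No. of parameters" figures printed above,
# assuming the hidden layer counts encoder weights (in * nodes), an
# encoder bias (nodes), and a decoder bias (in) for the tied-weight
# denoising autoencoder; the output layer counts weights plus bias.

def hidden_layer_params(n_in: int, n_nodes: int) -> int:
    # W (n_in * n_nodes) + encoder bias (n_nodes) + decoder bias (n_in)
    return n_in * n_nodes + n_nodes + n_in

def output_layer_params(n_in: int, n_out: int) -> int:
    # W (n_in * n_out) + bias (n_out)
    return n_in * n_out + n_out

print(hidden_layer_params(3, 41))   # 167, matching the 1-th layer above
print(output_layer_params(41, 2))   # 84, matching the 2-th layer above
```

The same formulas reproduce the counts in every structure printed below, including the hyperplane runs with 4 inputs (e.g. 4 × 9 + 9 + 4 = 49).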
100% (100 of 100) |######################| Elapsed Time: 0:04:56 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.34646464646464 (+/-) 5.834611342795457
Precision: 0.9233072665145964
Recall: 0.9234646464646464
F1 score: 0.9230197967532785
Testing Time: 0.0015243930046004478 (+/-) 0.0005180127207044876
Training Time: 2.994499876041605 (+/-) 0.13171972905478677
=== Average network evolution ===
Total hidden node: 24.19 (+/-) 11.160371857604027
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=42, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 42
No. of parameters : 171
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=42, out_features=2, bias=True)
)
No. of inputs : 42
No. of output : 2
No. of parameters : 86
100% (100 of 100) |######################| Elapsed Time: 0:04:53 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.54949494949494 (+/-) 5.947283120265988
Precision: 0.9252909406254234
Recall: 0.9254949494949495
F1 score: 0.925138674442084
Testing Time: 0.0015414724446306326 (+/-) 0.0004987503919768464
Training Time: 2.9658606269142846 (+/-) 0.06883233059119875
=== Average network evolution ===
Total hidden node: 21.85 (+/-) 10.446410866895864
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=38, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 38
No. of parameters : 155
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=38, out_features=2, bias=True)
)
No. of inputs : 38
No. of output : 2
No. of parameters : 78
100% (100 of 100) |######################| Elapsed Time: 0:05:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.91111111111111 (+/-) 6.408404974795807
Precision: 0.9189998641859611
Recall: 0.9191111111111111
F1 score: 0.9185510936527751
Testing Time: 0.0016755238927976049 (+/-) 0.0007037523509520552
Training Time: 3.416408004182758 (+/-) 0.3602184100997241
=== Average network evolution ===
Total hidden node: 20.18 (+/-) 9.477742347204844
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=36, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 36
No. of parameters : 147
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=36, out_features=2, bias=True)
)
No. of inputs : 36
No. of output : 2
No. of parameters : 74
100% (100 of 100) |######################| Elapsed Time: 0:06:13 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.9070707070707 (+/-) 6.1077075420523315
Precision: 0.9189647880251671
Recall: 0.919070707070707
F1 score: 0.9185060043371699
Testing Time: 0.0018411573737558693 (+/-) 0.0006024907397946985
Training Time: 3.7688460614946155 (+/-) 0.07927796664419721
=== Average network evolution ===
Total hidden node: 23.77 (+/-) 10.982581663707307
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=40, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 40
No. of parameters : 163
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=40, out_features=2, bias=True)
)
No. of inputs : 40
No. of output : 2
No. of parameters : 82
========== Performance occupancy ==========
Preq Accuracy: 92.13 (+/-) 0.27
F1 score: 0.92 (+/-) 0.0
Precision: 0.92 (+/-) 0.0
Recall: 0.92 (+/-) 0.0
Training time: 3.28 (+/-) 0.3
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 39.4 (+/-) 2.15

50% Data
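The summary block above aggregates the five all-data runs. Its "Preq Accuracy: 92.13 (+/-) 0.27" can be reproduced from the per-run accuracies printed above, assuming the (+/-) figure is the population standard deviation across runs; a minimal sketch:

```python
# Sketch: recompute the "Preq Accuracy" summary from the five per-run
# accuracies of the all-data SEA experiment, assuming the (+/-) value
# is the population standard deviation over the five runs.
from statistics import fmean, pstdev

accuracies = [
    91.92525252525252,
    92.34646464646464,
    92.54949494949494,
    91.91111111111111,
    91.9070707070707,
]

print(round(fmean(accuracies), 2))   # 92.13
print(round(pstdev(accuracies), 2))  # 0.27
```

The same aggregation applies to the 50%, 25%, and infinite-delay summary blocks below.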
100% (100 of 100) |######################| Elapsed Time: 0:04:43 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.39999999999996 (+/-) 6.293543268198517
Precision: 0.9140736880099033
Recall: 0.914
F1 score: 0.913226848749671
Testing Time: 0.0016753215982456399 (+/-) 0.0006445702898571082
Training Time: 2.860139251959444 (+/-) 0.04804606533367886
=== Average network evolution ===
Total hidden node: 17.58 (+/-) 10.063975357680484
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=35, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 35
No. of parameters : 143
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=35, out_features=2, bias=True)
)
No. of inputs : 35
No. of output : 2
No. of parameters : 72
100% (100 of 100) |######################| Elapsed Time: 0:04:49 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.72424242424243 (+/-) 6.5125223729674975
Precision: 0.9170373414728702
Recall: 0.9172424242424242
F1 score: 0.9167278007318695
Testing Time: 0.0017863788990059284 (+/-) 0.0005833241113596126
Training Time: 2.916241585606276 (+/-) 0.06202416120809332
=== Average network evolution ===
Total hidden node: 23.0 (+/-) 10.0687635785135
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=38, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 38
No. of parameters : 155
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=38, out_features=2, bias=True)
)
No. of inputs : 38
No. of output : 2
No. of parameters : 78
100% (100 of 100) |######################| Elapsed Time: 0:04:49 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.56464646464649 (+/-) 6.469076607039062
Precision: 0.9155145236230928
Recall: 0.9156464646464646
F1 score: 0.9150368448204095
Testing Time: 0.0018366515034376973 (+/-) 0.0006241159766270944
Training Time: 2.9216011533833512 (+/-) 0.044804060749780855
=== Average network evolution ===
Total hidden node: 19.83 (+/-) 8.503005351050888
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=34, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 34
No. of parameters : 139
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=34, out_features=2, bias=True)
)
No. of inputs : 34
No. of output : 2
No. of parameters : 70
100% (100 of 100) |######################| Elapsed Time: 0:04:46 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.53535353535354 (+/-) 6.771782110775629
Precision: 0.915421042988246
Recall: 0.9153535353535354
F1 score: 0.9146105883630826
Testing Time: 0.0018059123646129262 (+/-) 0.0006539972605315815
Training Time: 2.8862785401970448 (+/-) 0.10908190014071514
=== Average network evolution ===
Total hidden node: 20.35 (+/-) 8.54210161494231
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=34, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 34
No. of parameters : 139
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=34, out_features=2, bias=True)
)
No. of inputs : 34
No. of output : 2
No. of parameters : 70
100% (100 of 100) |######################| Elapsed Time: 0:04:46 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.92121212121211 (+/-) 7.283872772856909
Precision: 0.9089107968957689
Recall: 0.9092121212121212
F1 score: 0.9089962523471589
Testing Time: 0.0018373980666651871 (+/-) 0.0006019725634470285
Training Time: 2.8923514539545234 (+/-) 0.052094178243336405
=== Average network evolution ===
Total hidden node: 18.0 (+/-) 9.691233151668573
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=34, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 34
No. of parameters : 139
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=34, out_features=2, bias=True)
)
No. of inputs : 34
No. of output : 2
No. of parameters : 70
========== Performance occupancy ==========
Preq Accuracy: 91.43 (+/-) 0.27
F1 score: 0.91 (+/-) 0.0
Precision: 0.91 (+/-) 0.0
Recall: 0.91 (+/-) 0.0
Training time: 2.9 (+/-) 0.02
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 35.0 (+/-) 1.55

25% Data
100% (100 of 100) |######################| Elapsed Time: 0:03:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.37878787878788 (+/-) 6.1496041217296655
Precision: 0.9134581726611755
Recall: 0.9137878787878788
F1 score: 0.9134014220236669
Testing Time: 0.0017660555213388771 (+/-) 0.0007001542830051438
Training Time: 2.4162122986533423 (+/-) 0.040948938053211925
=== Average network evolution ===
Total hidden node: 17.08 (+/-) 5.790820321854236
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=27, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 27
No. of parameters : 111
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=27, out_features=2, bias=True)
)
No. of inputs : 27
No. of output : 2
No. of parameters : 56
100% (100 of 100) |######################| Elapsed Time: 0:03:58 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.34747474747476 (+/-) 6.190712226030534
Precision: 0.9134210808651505
Recall: 0.9134747474747474
F1 score: 0.9127670902922834
Testing Time: 0.0017683192937061041 (+/-) 0.0006461541402412309
Training Time: 2.40095814309939 (+/-) 0.040973986613750096
=== Average network evolution ===
Total hidden node: 19.75 (+/-) 6.630799348494871
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=31, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 31
No. of parameters : 127
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=31, out_features=2, bias=True)
)
No. of inputs : 31
No. of output : 2
No. of parameters : 64
100% (100 of 100) |######################| Elapsed Time: 0:03:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.64141414141412 (+/-) 7.349885041374196
Precision: 0.9063068097474899
Recall: 0.9064141414141414
F1 score: 0.9055924012407242
Testing Time: 0.0017155998885029493 (+/-) 0.0006470350971316688
Training Time: 2.4141599987492417 (+/-) 0.04049285263299466
=== Average network evolution ===
Total hidden node: 14.86 (+/-) 6.648338138211683
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=27, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 27
No. of parameters : 111
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=27, out_features=2, bias=True)
)
No. of inputs : 27
No. of output : 2
No. of parameters : 56
100% (100 of 100) |######################| Elapsed Time: 0:04:00 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.05454545454545 (+/-) 6.945142517714682
Precision: 0.9103031200487123
Recall: 0.9105454545454545
F1 score: 0.9099273509970811
Testing Time: 0.0019661970812864978 (+/-) 0.0012452893604838657
Training Time: 2.4292060871316927 (+/-) 0.04447180485965453
=== Average network evolution ===
Total hidden node: 16.86 (+/-) 6.519233083730017
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=28, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 28
No. of parameters : 115
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=28, out_features=2, bias=True)
)
No. of inputs : 28
No. of output : 2
No. of parameters : 58
100% (100 of 100) |######################| Elapsed Time: 0:04:01 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.7727272727273 (+/-) 7.237204915312672
Precision: 0.90740060769996
Recall: 0.9077272727272727
F1 score: 0.9071429720777046
Testing Time: 0.001739099772289546 (+/-) 0.0005713936475549902
Training Time: 2.430939387793493 (+/-) 0.08472854132452794
=== Average network evolution ===
Total hidden node: 15.28 (+/-) 7.686455620115165
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=28, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 28
No. of parameters : 115
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=28, out_features=2, bias=True)
)
No. of inputs : 28
No. of output : 2
No. of parameters : 58
========== Performance occupancy ==========
Preq Accuracy: 91.04 (+/-) 0.3
F1 score: 0.91 (+/-) 0.0
Precision: 0.91 (+/-) 0.0
Recall: 0.91 (+/-) 0.0
Training time: 2.42 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 28.2 (+/-) 1.47

Infinite Delay
100% (100 of 100) |######################| Elapsed Time: 0:03:19 ETA: 00:00:00
=== Performance result ===
Accuracy: 74.6050505050505 (+/-) 7.964103476811964
Precision: 0.7880601214303441
Recall: 0.7460505050505051
F1 score: 0.7092281958038471
Testing Time: 0.0016816842435586332 (+/-) 0.0006595471978042406
Training Time: 1.9759978381070225 (+/-) 0.04661539567161622
=== Average network evolution ===
Total hidden node: 11.06 (+/-) 11.06
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=10, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 10
No. of parameters : 43
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=10, out_features=2, bias=True)
)
No. of inputs : 10
No. of output : 2
No. of parameters : 22
100% (100 of 100) |######################| Elapsed Time: 0:03:19 ETA: 00:00:00
=== Performance result ===
Accuracy: 75.94848484848484 (+/-) 7.867146411108086
Precision: 0.8083488730562184
Recall: 0.7594848484848484
F1 score: 0.7251637414309624
Testing Time: 0.0017073852847320865 (+/-) 0.0006031628478924322
Training Time: 1.9699685308668349 (+/-) 0.03765807210050614
=== Average network evolution ===
Total hidden node: 12.24 (+/-) 12.24
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=13, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 13
No. of parameters : 55
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=13, out_features=2, bias=True)
)
No. of inputs : 13
No. of output : 2
No. of parameters : 28
100% (100 of 100) |######################| Elapsed Time: 0:02:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 57.881818181818176 (+/-) 9.066759843954854
Precision: 0.769403021109308
Recall: 0.5788181818181818
F1 score: 0.5557339157018713
Testing Time: 0.0015139724269057765 (+/-) 0.0005191192721505033
Training Time: 1.7734314432047835 (+/-) 0.23507604624415554
=== Average network evolution ===
Total hidden node: 9.98 (+/-) 9.98
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=10, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 10
No. of parameters : 43
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=10, out_features=2, bias=True)
)
No. of inputs : 10
No. of output : 2
No. of parameters : 22
100% (100 of 100) |######################| Elapsed Time: 0:03:04 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.28686868686869 (+/-) 6.326561714246609
Precision: 0.8111327238727221
Recall: 0.8028686868686868
F1 score: 0.7916855797773654
Testing Time: 0.0014902678402987394 (+/-) 0.0005939682096889087
Training Time: 1.8313889961050014 (+/-) 0.24555669559995505
=== Average network evolution ===
Total hidden node: 2.52 (+/-) 2.52
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=2, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 2
No. of parameters : 11
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=2, out_features=2, bias=True)
)
No. of inputs : 2
No. of output : 2
No. of parameters : 6
100% (100 of 100) |######################| Elapsed Time: 0:03:04 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.60808080808081 (+/-) 5.931437844448929
Precision: 0.8355603943910512
Recall: 0.8360808080808081
F1 score: 0.8327194163280498
Testing Time: 0.0014853236651179767 (+/-) 0.0006089309475658081
Training Time: 1.8211949305100874 (+/-) 0.23378756533230002
=== Average network evolution ===
Total hidden node: 7.96 (+/-) 7.96
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=8, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 8
No. of parameters : 35
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=8, out_features=2, bias=True)
)
No. of inputs : 8
No. of output : 2
No. of parameters : 18
========== Performance occupancy ==========
Preq Accuracy: 74.47 (+/-) 8.88
F1 score: 0.72 (+/-) 0.09
Precision: 0.8 (+/-) 0.02
Recall: 0.74 (+/-) 0.09
Training time: 1.87 (+/-) 0.08
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 8.6 (+/-) 3.67
%run DEVDAN_hyperplane.ipynb
Number of input: 4
Number of output: 2
Number of batch: 120

All Data
100% (120 of 120) |######################| Elapsed Time: 0:06:46 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.85630252100842 (+/-) 2.4846296802495216
Precision: 0.9185662008403982
Recall: 0.918563025210084
F1 score: 0.9185630546253908
Testing Time: 0.0016164499170639936 (+/-) 0.0006308179755702654
Training Time: 3.409361683020071 (+/-) 0.11576049211825501
=== Average network evolution ===
Total hidden node: 8.658333333333333 (+/-) 0.8514285381378496
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=9, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 9
No. of parameters : 49
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=9, out_features=2, bias=True)
)
No. of inputs : 9
No. of output : 2
No. of parameters : 20
100% (120 of 120) |######################| Elapsed Time: 0:06:37 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.7361344537815 (+/-) 1.9788686646683282
Precision: 0.9174344184343703
Recall: 0.9173613445378151
F1 score: 0.9173586657913918
Testing Time: 0.00160390990121024 (+/-) 0.0005827877406234725
Training Time: 3.3403382882350634 (+/-) 0.0797951751431891
=== Average network evolution ===
Total hidden node: 8.383333333333333 (+/-) 0.6853628398317363
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=9, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 9
No. of parameters : 49
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=9, out_features=2, bias=True)
)
No. of inputs : 9
No. of output : 2
No. of parameters : 20
100% (120 of 120) |######################| Elapsed Time: 0:06:27 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.30168067226889 (+/-) 3.0117133567801484
Precision: 0.9132557785397316
Recall: 0.913016806722689
F1 score: 0.9130060434286155
Testing Time: 0.0015609304444128725 (+/-) 0.0005599276659532174
Training Time: 3.2499673887461173 (+/-) 0.18157636649205716
=== Average network evolution ===
Total hidden node: 12.625 (+/-) 1.5867288153094508
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=13, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 13
No. of parameters : 69
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=13, out_features=2, bias=True)
)
No. of inputs : 13
No. of output : 2
No. of parameters : 28
100% (120 of 120) |######################| Elapsed Time: 0:06:54 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.3747899159664 (+/-) 3.698494322015696
Precision: 0.9139490818146658
Recall: 0.9137478991596638
F1 score: 0.9137390640896543
Testing Time: 0.0016546389635871438 (+/-) 0.0005697500949408061
Training Time: 3.4754735241417123 (+/-) 0.3521201829537271
=== Average network evolution ===
Total hidden node: 10.916666666666666 (+/-) 1.2354711202164494
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=12, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 12
No. of parameters : 64
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=12, out_features=2, bias=True)
)
No. of inputs : 12
No. of output : 2
No. of parameters : 26
100% (120 of 120) |######################| Elapsed Time: 0:07:27 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.27983193277309 (+/-) 2.1203923781459957
Precision: 0.9227995428796716
Recall: 0.922798319327731
F1 score: 0.9227983674936292
Testing Time: 0.0017458350718522271 (+/-) 0.0006467998789099099
Training Time: 3.757355964484335 (+/-) 0.21553145588596484
=== Average network evolution ===
Total hidden node: 9.891666666666667 (+/-) 1.1090223422255998
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=11, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 11
No. of parameters : 59
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=11, out_features=2, bias=True)
)
No. of inputs : 11
No. of output : 2
No. of parameters : 24
========== Performance occupancy ==========
Preq Accuracy: 91.71 (+/-) 0.35
F1 score: 0.92 (+/-) 0.0
Precision: 0.92 (+/-) 0.0
Recall: 0.92 (+/-) 0.0
Training time: 3.45 (+/-) 0.17
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 10.8 (+/-) 1.6

50% Data
100% (120 of 120) |######################| Elapsed Time: 0:05:50 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.58823529411761 (+/-) 3.243304064616984
Precision: 0.9158913889413053
Recall: 0.9158823529411765
F1 score: 0.9158822255855401
Testing Time: 0.0017175834719874278 (+/-) 0.000627979051912992
Training Time: 2.940670784781961 (+/-) 0.09285771106225826
=== Average network evolution ===
Total hidden node: 8.891666666666667 (+/-) 1.4537069473896342
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=10, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 10
No. of parameters : 54
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=10, out_features=2, bias=True)
)
No. of inputs : 10
No. of output : 2
No. of parameters : 22
100% (120 of 120) |######################| Elapsed Time: 0:05:48 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.96554621848739 (+/-) 3.4593300214700076
Precision: 0.9098527037902825
Recall: 0.9096554621848739
F1 score: 0.9096463155161267
Testing Time: 0.0016743515719886588 (+/-) 0.0006054548508114766
Training Time: 2.9264531616403273 (+/-) 0.07038351093645351
=== Average network evolution ===
Total hidden node: 7.366666666666666 (+/-) 1.2905640455070628
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=9, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 9
No. of parameters : 49
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=9, out_features=2, bias=True)
)
No. of inputs : 9
No. of output : 2
No. of parameters : 20
100% (120 of 120) |######################| Elapsed Time: 0:05:22 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.85714285714286 (+/-) 3.3291322065216025
Precision: 0.908832793807178
Recall: 0.9085714285714286
F1 score: 0.9085588296288148
Testing Time: 0.0017051416284897748 (+/-) 0.0005645431991933695
Training Time: 2.7088816907225537 (+/-) 0.33067538475833785
=== Average network evolution ===
Total hidden node: 10.875 (+/-) 0.9709316831442536
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=12, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 12
No. of parameters : 64
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=12, out_features=2, bias=True)
)
No. of inputs : 12
No. of output : 2
No. of parameters : 26
100% (120 of 120) |######################| Elapsed Time: 0:04:11 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.79495798319329 (+/-) 4.084235911578549
Precision: 0.9080071069722248
Recall: 0.9079495798319328
F1 score: 0.9079472815001712
Testing Time: 0.0014418894503296924 (+/-) 0.0004986817481041347
Training Time: 2.106316370122573 (+/-) 0.05737595003662739
=== Average network evolution ===
Total hidden node: 11.058333333333334 (+/-) 0.8971792215357844
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=12, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 12
No. of parameters : 64
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=12, out_features=2, bias=True)
)
No. of inputs : 12
No. of output : 2
No. of parameters : 26
100% (120 of 120) |######################| Elapsed Time: 0:04:17 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.22689075630255 (+/-) 3.4619345997490725
Precision: 0.9123051571718592
Recall: 0.9122689075630253
F1 score: 0.9122676853507298
Testing Time: 0.0014235272127039293 (+/-) 0.0005127906568978413
Training Time: 2.16469283464576 (+/-) 0.17187148708337796
=== Average network evolution ===
Total hidden node: 11.033333333333333 (+/-) 1.667999467092907
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=12, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 12
No. of parameters : 64
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=12, out_features=2, bias=True)
)
No. of inputs : 12
No. of output : 2
No. of parameters : 26
========== Performance occupancy ==========
Preq Accuracy: 91.09 (+/-) 0.29
F1 score: 0.91 (+/-) 0.0
Precision: 0.91 (+/-) 0.0
Recall: 0.91 (+/-) 0.0
Training time: 2.57 (+/-) 0.36
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 11.0 (+/-) 1.26

25% Data
100% (120 of 120) |######################| Elapsed Time: 0:03:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.04033613445381 (+/-) 5.224687990004378
Precision: 0.9004311368867022
Recall: 0.9004033613445378
F1 score: 0.9004023537390634
Testing Time: 0.0015495184088955406 (+/-) 0.0005673006966479277
Training Time: 2.0059585691500113 (+/-) 0.13431909829980931
=== Average network evolution ===
Total hidden node: 12.558333333333334 (+/-) 1.7068286055983735
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=14, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 14
No. of parameters : 74
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=14, out_features=2, bias=True)
)
No. of inputs : 14
No. of output : 2
No. of parameters : 30
100% (120 of 120) |######################| Elapsed Time: 0:04:03 ETA: 00:00:00
=== Performance result ===
Accuracy: 89.9983193277311 (+/-) 6.110192382965803
Precision: 0.900059482624946
Recall: 0.8999831932773109
F1 score: 0.8999796347825514
Testing Time: 0.0016129237263142562 (+/-) 0.0005046553001506729
Training Time: 2.0459356287948225 (+/-) 0.04598585728298637
=== Average network evolution ===
Total hidden node: 12.025 (+/-) 2.6059627139824295
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=14, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 14
No. of parameters : 74
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=14, out_features=2, bias=True)
)
No. of inputs : 14
No. of output : 2
No. of parameters : 30
100% (120 of 120) |######################| Elapsed Time: 0:04:02 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.21512605042017 (+/-) 3.546939470298307
Precision: 0.9021670871388804
Recall: 0.9021512605042017
F1 score: 0.9021508252729623
Testing Time: 0.0015794729986110655 (+/-) 0.0005107477592847049
Training Time: 2.035938004485699 (+/-) 0.027089092148911555
=== Average network evolution ===
Total hidden node: 10.741666666666667 (+/-) 1.2941910815469082
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=12, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 12
No. of parameters : 64
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=12, out_features=2, bias=True)
)
No. of inputs : 12
No. of output : 2
No. of parameters : 26
100% (120 of 120) |######################| Elapsed Time: 0:04:02 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.79411764705883 (+/-) 3.042729151936058
Precision: 0.9081263879798764
Recall: 0.9079411764705883
F1 score: 0.9079324403317255
Testing Time: 0.0016272448692001215 (+/-) 0.0005138433361262728
Training Time: 2.0324966246340455 (+/-) 0.020026302197294177
=== Average network evolution ===
Total hidden node: 9.266666666666667 (+/-) 1.1883695646650592
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=10, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 10
No. of parameters : 54
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=10, out_features=2, bias=True)
)
No. of inputs : 10
No. of output : 2
No. of parameters : 22
100% (120 of 120) |######################| Elapsed Time: 0:04:01 ETA: 00:00:00
=== Performance result ===
Accuracy: 89.56302521008402 (+/-) 7.404789110600903
Precision: 0.8963435630290143
Recall: 0.8956302521008404
F1 score: 0.8955872091396918
Testing Time: 0.0015367339639102712 (+/-) 0.0005471801792978896
Training Time: 2.0300943631084025 (+/-) 0.019507861707926268
=== Average network evolution ===
Total hidden node: 9.25 (+/-) 1.8984642916490861
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=11, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 11
No. of parameters : 59
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=11, out_features=2, bias=True)
)
No. of inputs : 11
No. of output : 2
No. of parameters : 24
========== Performance occupancy ==========
Preq Accuracy: 90.12 (+/-) 0.4
F1 score: 0.9 (+/-) 0.0
Precision: 0.9 (+/-) 0.0
Recall: 0.9 (+/-) 0.0
Training time: 2.03 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 12.2 (+/-) 1.6

Infinite Delay
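The "Preq Accuracy" in these summaries is prequential (test-then-train) accuracy: each incoming sample is first used to test the current model and only afterwards to train on it. A minimal sketch of that loop, using a hypothetical running majority-class guesser as the model (a stand-in, not DEVDAN):

```python
from collections import Counter

# Sketch of prequential (test-then-train) evaluation, the scheme behind
# the "Preq Accuracy" figures. `predict`/`update` below are hypothetical
# stand-ins: a running majority-class guesser, not the DEVDAN learner.
def prequential_accuracy(stream, predict, update):
    correct = 0
    for x, y in stream:
        if predict(x) == y:   # test on the sample first...
            correct += 1
        update(x, y)          # ...then train on it
    return correct / len(stream)

counts = Counter()
predict = lambda x: counts.most_common(1)[0][0] if counts else 0
update = lambda x, y: counts.update([y])

stream = [(0, 1), (1, 1), (2, 0), (3, 1), (4, 1)]
acc = prequential_accuracy(stream, predict, update)  # 3 of 5 correct -> 0.6
```

In the runs above the same idea is applied per batch (the progress bars count batches), so each batch is scored before the network adapts to it.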
100% (120 of 120) |######################| Elapsed Time: 0:03:22 ETA: 00:00:00
=== Performance result ===
Accuracy: 56.01092436974789 (+/-) 9.945750242442063
Precision: 0.7217668208630682
Recall: 0.5601092436974789
F1 score: 0.4614369828794625
Testing Time: 0.001488076538598838 (+/-) 0.0005172047536335391
Training Time: 1.6679233863574117 (+/-) 0.03211003987336133
=== Average network evolution ===
Total hidden node: 2.2416666666666667 (+/-) 2.2416666666666667
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=2, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 2
No. of parameters : 14
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=2, out_features=2, bias=True)
)
No. of inputs : 2
No. of output : 2
No. of parameters : 6
100% (120 of 120) |######################| Elapsed Time: 0:03:21 ETA: 00:00:00
=== Performance result ===
Accuracy: 77.69747899159665 (+/-) 9.217625263225672
Precision: 0.8087717543957658
Recall: 0.7769747899159664
F1 score: 0.7710047302207703
Testing Time: 0.0015065369485807018 (+/-) 0.0005019445951762001
Training Time: 1.6616621438194723 (+/-) 0.018153549181321938
=== Average network evolution ===
Total hidden node: 4.641666666666667 (+/-) 4.641666666666667
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=5, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 5
No. of parameters : 29
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=5, out_features=2, bias=True)
)
No. of inputs : 5
No. of output : 2
No. of parameters : 12
100% (120 of 120) |######################| Elapsed Time: 0:03:20 ETA: 00:00:00
=== Performance result ===
Accuracy: 86.4252100840336 (+/-) 3.2580403061559826
Precision: 0.8644107153168673
Recall: 0.8642521008403361
F1 score: 0.8642399442079186
Testing Time: 0.0015534312785172662 (+/-) 0.0005123165606415981
Training Time: 1.6594710850915988 (+/-) 0.01721401430479802
=== Average network evolution ===
Total hidden node: 6.983333333333333 (+/-) 6.983333333333333
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=7, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 7
No. of parameters : 39
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=7, out_features=2, bias=True)
)
No. of inputs : 7
No. of output : 2
No. of parameters : 16
100% (120 of 120) |######################| Elapsed Time: 0:03:21 ETA: 00:00:00
=== Performance result ===
Accuracy: 78.64789915966387 (+/-) 3.5135714966541003
Precision: 0.789667494038251
Recall: 0.7864789915966387
F1 score: 0.7859138592001494
Testing Time: 0.0015362350880598822 (+/-) 0.0005496850381586778
Training Time: 1.666143870153347 (+/-) 0.025121387579147808
=== Average network evolution ===
Total hidden node: 4.716666666666667 (+/-) 4.716666666666667
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=4, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 4
No. of parameters : 24
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=4, out_features=2, bias=True)
)
No. of inputs : 4
No. of output : 2
No. of parameters : 10
100% (120 of 120) |######################| Elapsed Time: 0:03:22 ETA: 00:00:00
=== Performance result ===
Accuracy: 76.63025210084034 (+/-) 5.227537856131268
Precision: 0.819552281461266
Recall: 0.7663025210084033
F1 score: 0.7562451892787713
Testing Time: 0.0014777604271383846 (+/-) 0.0005202872553361456
Training Time: 1.6731170085297913 (+/-) 0.023915686972821818
=== Average network evolution ===
Total hidden node: 8.958333333333334 (+/-) 8.958333333333334
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=4, out_features=9, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 4
No. of nodes : 9
No. of parameters : 49
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=9, out_features=2, bias=True)
)
No. of inputs : 9
No. of output : 2
No. of parameters : 20
========== Performance occupancy ==========
Preq Accuracy: 75.08 (+/-) 10.14
F1 score: 0.73 (+/-) 0.14
Precision: 0.8 (+/-) 0.05
Recall: 0.75 (+/-) 0.1
Training time: 1.67 (+/-) 0.0
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 5.4 (+/-) 2.42
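The (+/-) spread in each summary is consistent with the population (not sample) standard deviation over the five runs; that detail is an inference from matching the reported numbers. For instance, the five "Infinite Delay" accuracies above reproduce the summary's 75.08 (+/-) 10.14:

```python
from statistics import mean, pstdev

# The five "Infinite Delay" run accuracies reported above.
runs = [56.01092436974789, 77.69747899159665, 86.4252100840336,
        78.64789915966387, 76.63025210084034]

preq_acc = round(mean(runs), 2)   # 75.08, as in the summary
spread = round(pstdev(runs), 2)   # 10.14 -- population std dev; using
                                  # pstdev (not stdev) is an inference
                                  # from matching the reported figure
```

The sample standard deviation (`statistics.stdev`) would give about 11.34 here, so the summaries evidently divide by N rather than N-1.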
%run DEVDAN_weather.ipynb
Number of input: 8
Number of output: 2
Number of batch: 18

All Data
100% (18 of 18) |########################| Elapsed Time: 0:00:55 ETA: 00:00:00
=== Performance result ===
Accuracy: 75.19411764705882 (+/-) 2.743649477748097
Precision: 0.7448029154092382
Recall: 0.7519411764705882
F1 score: 0.7471241306691262
Testing Time: 0.0018139165990492877 (+/-) 0.0006275018204258435
Training Time: 3.249160850749296 (+/-) 0.15658638474850642
=== Average network evolution ===
Total hidden node: 13.11111111111111 (+/-) 0.5665577237325317
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=15, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 15
No. of parameters : 143
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=15, out_features=2, bias=True)
)
No. of inputs : 15
No. of output : 2
No. of parameters : 32
100% (18 of 18) |########################| Elapsed Time: 0:00:52 ETA: 00:00:00
=== Performance result ===
Accuracy: 74.61764705882354 (+/-) 2.4353367430199704
Precision: 0.7401296279707679
Recall: 0.7461764705882353
F1 score: 0.742390592572663
Testing Time: 0.001689896864049575 (+/-) 0.000569321351826984
Training Time: 3.1122857682845173 (+/-) 0.04266007794062158
=== Average network evolution ===
Total hidden node: 8.555555555555555 (+/-) 1.2120791238484128
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=11, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 11
No. of parameters : 107
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=11, out_features=2, bias=True)
)
No. of inputs : 11
No. of output : 2
No. of parameters : 24
100% (18 of 18) |########################| Elapsed Time: 0:00:52 ETA: 00:00:00
=== Performance result ===
Accuracy: 74.05882352941177 (+/-) 2.94279955215306
Precision: 0.7335877144267555
Recall: 0.7405882352941177
F1 score: 0.7361008165090327
Testing Time: 0.001869664472692153 (+/-) 0.00032897344308848044
Training Time: 3.1055023389704086 (+/-) 0.01389836774700889
=== Average network evolution ===
Total hidden node: 13.38888888888889 (+/-) 1.0076865081787252
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=15, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 15
No. of parameters : 143
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=15, out_features=2, bias=True)
)
No. of inputs : 15
No. of output : 2
No. of parameters : 32
100% (18 of 18) |########################| Elapsed Time: 0:00:52 ETA: 00:00:00
=== Performance result ===
Accuracy: 73.85882352941177 (+/-) 3.077634466604928
Precision: 0.7277563598384913
Recall: 0.7385882352941177
F1 score: 0.7302970390142409
Testing Time: 0.001766807892743279 (+/-) 0.0006294903402457582
Training Time: 3.10989907208611 (+/-) 0.024054273087786827
=== Average network evolution ===
Total hidden node: 11.555555555555555 (+/-) 1.8324913891634047
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=15, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 15
No. of parameters : 143
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=15, out_features=2, bias=True)
)
No. of inputs : 15
No. of output : 2
No. of parameters : 32
100% (18 of 18) |########################| Elapsed Time: 0:00:53 ETA: 00:00:00
=== Performance result ===
Accuracy: 73.1705882352941 (+/-) 4.845223810271955
Precision: 0.7204882096882087
Recall: 0.7317058823529412
F1 score: 0.7233670010440546
Testing Time: 0.001747439889346852 (+/-) 0.00042623322455200566
Training Time: 3.1214717275956096 (+/-) 0.04983836196406317
=== Average network evolution ===
Total hidden node: 12.277777777777779 (+/-) 1.2385276005337753
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=14, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 14
No. of parameters : 134
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=14, out_features=2, bias=True)
)
No. of inputs : 14
No. of output : 2
No. of parameters : 30
========== Performance occupancy ==========
Preq Accuracy: 74.18 (+/-) 0.69
F1 score: 0.74 (+/-) 0.01
Precision: 0.73 (+/-) 0.01
Recall: 0.74 (+/-) 0.01
Training time: 3.14 (+/-) 0.05
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 14.0 (+/-) 1.55

50% Data
100% (18 of 18) |########################| Elapsed Time: 0:00:40 ETA: 00:00:00
=== Performance result ===
Accuracy: 73.84117647058824 (+/-) 3.543188797003495
Precision: 0.7230565009461757
Recall: 0.7384117647058823
F1 score: 0.7187420830536322
Testing Time: 0.0015752035028794233 (+/-) 0.0004918285294427089
Training Time: 2.3648371275733497 (+/-) 0.01885704170853602
=== Average network evolution ===
Total hidden node: 8.166666666666666 (+/-) 0.7637626158259734
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=10, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 10
No. of parameters : 98
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=10, out_features=2, bias=True)
)
No. of inputs : 10
No. of output : 2
No. of parameters : 22
100% (18 of 18) |########################| Elapsed Time: 0:00:40 ETA: 00:00:00
=== Performance result ===
Accuracy: 71.00588235294117 (+/-) 4.4800827970599055
Precision: 0.686885553081282
Recall: 0.7100588235294117
F1 score: 0.6843836439652206
Testing Time: 0.0013497717240277458 (+/-) 0.0004721289819953139
Training Time: 2.3712027072906494 (+/-) 0.01679469751286845
=== Average network evolution ===
Total hidden node: 4.5 (+/-) 0.6009252125773316
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=6, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 6
No. of parameters : 62
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=6, out_features=2, bias=True)
)
No. of inputs : 6
No. of output : 2
No. of parameters : 14
100% (18 of 18) |########################| Elapsed Time: 0:00:40 ETA: 00:00:00
=== Performance result ===
Accuracy: 70.86470588235294 (+/-) 2.93657286775119
Precision: 0.6860548442934544
Recall: 0.7086470588235294
F1 score: 0.6854351679566539
Testing Time: 0.0016374588012695312 (+/-) 0.00048341220600342267
Training Time: 2.3747700943666348 (+/-) 0.0330317355142675
=== Average network evolution ===
Total hidden node: 7.444444444444445 (+/-) 0.7617394000445604
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=9, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 9
No. of parameters : 89
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=9, out_features=2, bias=True)
)
No. of inputs : 9
No. of output : 2
No. of parameters : 20
100% (18 of 18) |########################| Elapsed Time: 0:00:40 ETA: 00:00:00
=== Performance result ===
Accuracy: 71.30588235294118 (+/-) 3.9717340394690086
Precision: 0.6942250575112038
Recall: 0.7130588235294117
F1 score: 0.6960409308094037
Testing Time: 0.0015714589287252987 (+/-) 0.000495141777230886
Training Time: 2.3792358005748078 (+/-) 0.017539975966491535
=== Average network evolution ===
Total hidden node: 11.555555555555555 (+/-) 1.7069212773041351
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=14, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 14
No. of parameters : 134
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=14, out_features=2, bias=True)
)
No. of inputs : 14
No. of output : 2
No. of parameters : 30
100% (18 of 18) |########################| Elapsed Time: 0:00:40 ETA: 00:00:00
=== Performance result ===
Accuracy: 71.98823529411764 (+/-) 2.761261935066726
Precision: 0.7002518380302651
Recall: 0.7198823529411764
F1 score: 0.6984202365610607
Testing Time: 0.0017408062429989084 (+/-) 0.0004277304053071516
Training Time: 2.3812663134406593 (+/-) 0.020216228023296538
=== Average network evolution ===
Total hidden node: 10.722222222222221 (+/-) 0.9891385452647142
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=12, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 12
No. of parameters : 116
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=12, out_features=2, bias=True)
)
No. of inputs : 12
No. of output : 2
No. of parameters : 26
========== Performance occupancy ==========
Preq Accuracy: 71.8 (+/-) 1.09
F1 score: 0.7 (+/-) 0.01
Precision: 0.7 (+/-) 0.01
Recall: 0.72 (+/-) 0.01
Training time: 2.37 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 10.2 (+/-) 2.71

25% Data
100% (18 of 18) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 70.4764705882353 (+/-) 2.9538667403461485
Precision: 0.6767315653230589
Recall: 0.704764705882353
F1 score: 0.6596906561897511
Testing Time: 0.0018053756040685316 (+/-) 0.0005124279597660776
Training Time: 2.007312900879804 (+/-) 0.009553546748353826
=== Average network evolution ===
Total hidden node: 9.222222222222221 (+/-) 0.9749960430435692
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=11, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 11
No. of parameters : 107
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=11, out_features=2, bias=True)
)
No. of inputs : 11
No. of output : 2
No. of parameters : 24
100% (18 of 18) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 69.2764705882353 (+/-) 5.193381468047996
Precision: 0.6716694552604342
Recall: 0.692764705882353
F1 score: 0.6759251589337258
Testing Time: 0.0016939780291389016 (+/-) 0.00045877167359273206
Training Time: 2.017654446994557 (+/-) 0.02622129294336216
=== Average network evolution ===
Total hidden node: 6.277777777777778 (+/-) 0.8695819912499182
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=8, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 8
No. of parameters : 80
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=8, out_features=2, bias=True)
)
No. of inputs : 8
No. of output : 2
No. of parameters : 18
100% (18 of 18) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 71.68823529411766 (+/-) 3.7504325010106068
Precision: 0.6990627425274522
Recall: 0.7168823529411765
F1 score: 0.7007922792104412
Testing Time: 0.001744536792530733 (+/-) 0.0005503991362091711
Training Time: 2.0143559399773094 (+/-) 0.01785212149113314
=== Average network evolution ===
Total hidden node: 12.666666666666666 (+/-) 0.7453559924999299
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=14, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 14
No. of parameters : 134
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=14, out_features=2, bias=True)
)
No. of inputs : 14
No. of output : 2
No. of parameters : 30
100% (18 of 18) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 72.34117647058824 (+/-) 2.5566657568770204
Precision: 0.7048955411046023
Recall: 0.7234117647058823
F1 score: 0.6871293813727632
Testing Time: 0.001753414378446691 (+/-) 0.0005463205231052841
Training Time: 2.0055096990921917 (+/-) 0.015670665188447896
=== Average network evolution ===
Total hidden node: 11.944444444444445 (+/-) 1.0786937688304221
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=14, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 14
No. of parameters : 134
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=14, out_features=2, bias=True)
)
No. of inputs : 14
No. of output : 2
No. of parameters : 30
100% (18 of 18) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.88823529411765 (+/-) 3.8527121620308216
Precision: 0.6489131204840051
Recall: 0.6888823529411765
F1 score: 0.5848684981101093
Testing Time: 0.0016893639284021715 (+/-) 0.0005722985048259948
Training Time: 2.0114390289082245 (+/-) 0.013226225262863047
=== Average network evolution ===
Total hidden node: 5.888888888888889 (+/-) 0.5665577237325317
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=7, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 7
No. of parameters : 71
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=7, out_features=2, bias=True)
)
No. of inputs : 7
No. of output : 2
No. of parameters : 16
========== Performance occupancy ==========
Preq Accuracy: 70.53 (+/-) 1.33
F1 score: 0.66 (+/-) 0.04
Precision: 0.68 (+/-) 0.02
Recall: 0.71 (+/-) 0.01
Training time: 2.01 (+/-) 0.0
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 10.8 (+/-) 2.93

Infinite Delay
100% (18 of 18) |########################| Elapsed Time: 0:00:31 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.61764705882354 (+/-) 4.071888605623808
Precision: 0.7218940864960282
Recall: 0.6861764705882353
F1 score: 0.5587592918706882
Testing Time: 0.001752166187061983 (+/-) 0.00042336070587403
Training Time: 1.648672244128059 (+/-) 0.02541755038935055
=== Average network evolution ===
Total hidden node: 10.88888888888889 (+/-) 10.88888888888889
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=11, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 11
No. of parameters : 107
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=11, out_features=2, bias=True)
)
No. of inputs : 11
No. of output : 2
No. of parameters : 24
100% (18 of 18) |########################| Elapsed Time: 0:00:31 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.67058823529412 (+/-) 4.167629415878635
Precision: 0.6442047985640439
Recall: 0.6867058823529412
F1 score: 0.5656881472183504
Testing Time: 0.0015804767608642578 (+/-) 0.0005966667818032134
Training Time: 1.6411755084991455 (+/-) 0.015749832669608207
=== Average network evolution ===
Total hidden node: 8.166666666666666 (+/-) 8.166666666666666
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=9, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 9
No. of parameters : 89
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=9, out_features=2, bias=True)
)
No. of inputs : 9
No. of output : 2
No. of parameters : 20
100% (18 of 18) |########################| Elapsed Time: 0:00:31 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.6058823529412 (+/-) 4.11946178498161
Precision: 0.6590537216828479
Recall: 0.6860588235294117
F1 score: 0.5585938782968816
Testing Time: 0.001628553166108973 (+/-) 0.0004705479159420442
Training Time: 1.6872956893023323 (+/-) 0.023702669294865565
=== Average network evolution ===
Total hidden node: 11.722222222222221 (+/-) 11.722222222222221
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=12, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 12
No. of parameters : 116
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=12, out_features=2, bias=True)
)
No. of inputs : 12
No. of output : 2
No. of parameters : 26
100% (18 of 18) |########################| Elapsed Time: 0:00:31 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.28235294117648 (+/-) 3.968143037939808
Precision: 0.6076107173798099
Recall: 0.6828235294117647
F1 score: 0.5734919569935745
Testing Time: 0.0016302080715403838 (+/-) 0.00048178200675731263
Training Time: 1.643831491470337 (+/-) 0.015034646758235857
=== Average network evolution ===
Total hidden node: 12.833333333333334 (+/-) 12.833333333333334
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=13, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 13
No. of parameters : 125
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=13, out_features=2, bias=True)
)
No. of inputs : 13
No. of output : 2
No. of parameters : 28
100% (18 of 18) |########################| Elapsed Time: 0:00:31 ETA: 00:00:00
=== Performance result ===
Accuracy: 63.458823529411774 (+/-) 4.407888493242329
Precision: 0.6276336545183165
Recall: 0.6345882352941177
F1 score: 0.6308172086354512
Testing Time: 0.001984694424797507 (+/-) 0.00048360878721759
Training Time: 1.669492988025441 (+/-) 0.06746450995347832
=== Average network evolution ===
Total hidden node: 11.777777777777779 (+/-) 11.777777777777779
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=12, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 12
No. of parameters : 116
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=12, out_features=2, bias=True)
)
No. of inputs : 12
No. of output : 2
No. of parameters : 26
========== Performance occupancy ==========
Preq Accuracy: 67.53 (+/-) 2.04
F1 score: 0.58 (+/-) 0.03
Precision: 0.65 (+/-) 0.04
Recall: 0.68 (+/-) 0.02
Training time: 1.66 (+/-) 0.02
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 11.4 (+/-) 1.36
%run DEVDAN_rfid.ipynb
Number of input: 3
Number of output: 4
Number of batch: 280

All Data
100% (280 of 280) |######################| Elapsed Time: 0:14:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 99.16200716845877 (+/-) 2.3856480045015664
Precision: 0.9916305545280637
Recall: 0.9916200716845878
F1 score: 0.9916086973596855
Testing Time: 0.001957467807236538 (+/-) 0.0008929287337201698
Training Time: 3.1446892078632094 (+/-) 0.05948944583883505
=== Average network evolution ===
Total hidden node: 56.746428571428574 (+/-) 9.87366562929815
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=63, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 63
No. of parameters : 255
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=63, out_features=4, bias=True)
)
No. of inputs : 63
No. of output : 4
No. of parameters : 256
100% (280 of 280) |######################| Elapsed Time: 0:14:39 ETA: 00:00:00
=== Performance result ===
Accuracy: 99.20860215053763 (+/-) 2.739999316405423
Precision: 0.9920817445976162
Recall: 0.9920860215053764
F1 score: 0.9920798790492309
Testing Time: 0.0019417449992190125 (+/-) 0.0005426591468695644
Training Time: 3.1491071545522273 (+/-) 0.040680363804178735
=== Average network evolution ===
Total hidden node: 60.357142857142854 (+/-) 10.702517346448003
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=65, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 65
No. of parameters : 263
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=65, out_features=4, bias=True)
)
No. of inputs : 65
No. of output : 4
No. of parameters : 264
100% (280 of 280) |######################| Elapsed Time: 0:14:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 98.90537634408602 (+/-) 4.2741961182111705
Precision: 0.9890584415947367
Recall: 0.9890537634408603
F1 score: 0.9890468781008143
Testing Time: 0.0018783752208969499 (+/-) 0.0004986310746219353
Training Time: 3.143739750735649 (+/-) 0.04479412162251396
=== Average network evolution ===
Total hidden node: 64.51428571428572 (+/-) 13.559649023210484
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=73, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 73
No. of parameters : 295
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=73, out_features=4, bias=True)
)
No. of inputs : 73
No. of output : 4
No. of parameters : 296
100% (280 of 280) |######################| Elapsed Time: 0:14:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 99.22078853046594 (+/-) 1.8266441276014085
Precision: 0.9922022505416935
Recall: 0.9922078853046595
F1 score: 0.9922010427154767
Testing Time: 0.0019096636003063572 (+/-) 0.0015309608370982178
Training Time: 3.1443920588407894 (+/-) 0.053524463308117755
=== Average network evolution ===
Total hidden node: 59.917857142857144 (+/-) 9.925636128280162
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=65, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 65
No. of parameters : 263
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=65, out_features=4, bias=True)
)
No. of inputs : 65
No. of output : 4
No. of parameters : 264
100% (280 of 280) |######################| Elapsed Time: 0:14:42 ETA: 00:00:00
=== Performance result ===
Accuracy: 99.26594982078855 (+/-) 1.7951612813772146
Precision: 0.9926561145859011
Recall: 0.9926594982078853
F1 score: 0.9926516957440156
Testing Time: 0.00179285644203104 (+/-) 0.000505124277207529
Training Time: 3.1589536316505895 (+/-) 0.047892787624809585
=== Average network evolution ===
Total hidden node: 58.19642857142857 (+/-) 9.986812094774114
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=64, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 64
No. of parameters : 259
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=64, out_features=4, bias=True)
)
No. of inputs : 64
No. of output : 4
No. of parameters : 260
========== Performance occupancy ==========
Preq Accuracy: 99.15 (+/-) 0.13
F1 score: 0.99 (+/-) 0.0
Precision: 0.99 (+/-) 0.0
Recall: 0.99 (+/-) 0.0
Training time: 3.15 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 66.0 (+/-) 3.58

50% Data
100% (280 of 280) |######################| Elapsed Time: 0:11:09 ETA: 00:00:00
=== Performance result ===
Accuracy: 98.67347670250895 (+/-) 4.41665296309441
Precision: 0.9867210092793409
Recall: 0.9867347670250896
F1 score: 0.986718382713746
Testing Time: 0.0018220904907445326 (+/-) 0.000548616112128294
Training Time: 2.396297507815891 (+/-) 0.035531816971533674
=== Average network evolution ===
Total hidden node: 53.61071428571429 (+/-) 12.635631005943761
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=61, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 61
No. of parameters : 247
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=61, out_features=4, bias=True)
)
No. of inputs : 61
No. of output : 4
No. of parameters : 248
100% (280 of 280) |######################| Elapsed Time: 0:11:09 ETA: 00:00:00
=== Performance result ===
Accuracy: 97.9010752688172 (+/-) 8.435249830562935
Precision: 0.9790886190266312
Recall: 0.9790107526881721
F1 score: 0.9789715977410168
Testing Time: 0.0019059899032756846 (+/-) 0.0004774420152486833
Training Time: 2.396136726529795 (+/-) 0.030635057666078932
=== Average network evolution ===
Total hidden node: 57.003571428571426 (+/-) 14.776549523360536
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=68, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 68
No. of parameters : 275
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=68, out_features=4, bias=True)
)
No. of inputs : 68
No. of output : 4
No. of parameters : 276
100% (280 of 280) |######################| Elapsed Time: 0:11:10 ETA: 00:00:00
=== Performance result ===
Accuracy: 98.39784946236558 (+/-) 6.626145321889703
Precision: 0.9839858995223008
Recall: 0.9839784946236559
F1 score: 0.9839664371313739
Testing Time: 0.0018169991004424284 (+/-) 0.0005103784658928544
Training Time: 2.3986312685046998 (+/-) 0.028100286029922046
=== Average network evolution ===
Total hidden node: 54.75714285714286 (+/-) 12.518769581581678
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=62, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 62
No. of parameters : 251
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=62, out_features=4, bias=True)
)
No. of inputs : 62
No. of output : 4
No. of parameters : 252
100% (280 of 280) |######################| Elapsed Time: 0:11:12 ETA: 00:00:00
=== Performance result ===
Accuracy: 98.83225806451614 (+/-) 3.654000447989094
Precision: 0.9883399507157558
Recall: 0.9883225806451613
F1 score: 0.9883024013059986
Testing Time: 0.0018657774908140995 (+/-) 0.0006024151503329402
Training Time: 2.4049223580240775 (+/-) 0.06540753316067224
=== Average network evolution ===
Total hidden node: 54.15357142857143 (+/-) 12.410594038471933
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=63, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 63
No. of parameters : 255
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=63, out_features=4, bias=True)
)
No. of inputs : 63
No. of output : 4
No. of parameters : 256
100% (280 of 280) |######################| Elapsed Time: 0:11:08 ETA: 00:00:00
=== Performance result ===
Accuracy: 98.71971326164875 (+/-) 4.935563858242005
Precision: 0.9871943743448504
Recall: 0.9871971326164874
F1 score: 0.9871784102238991
Testing Time: 0.0017406966096611433 (+/-) 0.0005074713731817038
Training Time: 2.3918756028657318 (+/-) 0.029083390457963352
=== Average network evolution ===
Total hidden node: 50.94642857142857 (+/-) 12.710821658921107
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=3, out_features=59, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 3, nodes: 59, parameters: 239
Layer 2, outputLayerBasicNet: Linear(in_features=59, out_features=4, bias=True); inputs: 59, outputs: 4, parameters: 240
========== Performance occupancy ==========
Preq Accuracy: 98.5 (+/-) 0.33
F1 score: 0.99 (+/-) 0.0
Precision: 0.99 (+/-) 0.0
Recall: 0.99 (+/-) 0.0
Training time: 2.4 (+/-) 0.0
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 62.6 (+/-) 3.01

25% Data
100% (280 of 280) |######################| Elapsed Time: 0:09:25 ETA: 00:00:00
=== Performance result ===
Accuracy: 98.16594982078851 (+/-) 6.590583736689308
Precision: 0.9816542274176899
Recall: 0.9816594982078853
F1 score: 0.9816565922866788
Testing Time: 0.0017446642708180199 (+/-) 0.0005250061664078484
Training Time: 2.020343662590109 (+/-) 0.031853494458596344
=== Average network evolution ===
Total hidden node: 45.65714285714286 (+/-) 14.114213417975627
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=3, out_features=57, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 3, nodes: 57, parameters: 231
Layer 2, outputLayerBasicNet: Linear(in_features=57, out_features=4, bias=True); inputs: 57, outputs: 4, parameters: 232
100% (280 of 280) |######################| Elapsed Time: 0:09:24 ETA: 00:00:00
=== Performance result ===
Accuracy: 98.15017921146952 (+/-) 7.082468003067731
Precision: 0.981488406246896
Recall: 0.9815017921146953
F1 score: 0.9814904683512408
Testing Time: 0.001727890370139939 (+/-) 0.0005045927592710244
Training Time: 2.0179740750234187 (+/-) 0.021743633146465043
=== Average network evolution ===
Total hidden node: 45.22142857142857 (+/-) 13.783306599518365
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=3, out_features=57, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 3, nodes: 57, parameters: 231
Layer 2, outputLayerBasicNet: Linear(in_features=57, out_features=4, bias=True); inputs: 57, outputs: 4, parameters: 232
100% (280 of 280) |######################| Elapsed Time: 0:09:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 98.65412186379929 (+/-) 4.138503859250891
Precision: 0.986524291344413
Recall: 0.9865412186379928
F1 score: 0.9865280841625699
Testing Time: 0.0018045987706885116 (+/-) 0.0004970554946503682
Training Time: 2.0328311638165544 (+/-) 0.030140800660536107
=== Average network evolution ===
Total hidden node: 47.17142857142857 (+/-) 12.69754917698848
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=3, out_features=59, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 3, nodes: 59, parameters: 239
Layer 2, outputLayerBasicNet: Linear(in_features=59, out_features=4, bias=True); inputs: 59, outputs: 4, parameters: 240
100% (280 of 280) |######################| Elapsed Time: 0:09:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 98.34767025089606 (+/-) 5.5838528038436985
Precision: 0.9834726024723873
Recall: 0.9834767025089606
F1 score: 0.9834384045488033
Testing Time: 0.0017111925241340446 (+/-) 0.0004989632509568414
Training Time: 2.031305744656525 (+/-) 0.030469042538728366
=== Average network evolution ===
Total hidden node: 46.01428571428571 (+/-) 12.231391530627667
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=3, out_features=57, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 3, nodes: 57, parameters: 231
Layer 2, outputLayerBasicNet: Linear(in_features=57, out_features=4, bias=True); inputs: 57, outputs: 4, parameters: 232
100% (280 of 280) |######################| Elapsed Time: 0:09:26 ETA: 00:00:00
=== Performance result ===
Accuracy: 98.468458781362 (+/-) 4.7735616112276515
Precision: 0.984667108653198
Recall: 0.9846845878136201
F1 score: 0.9846685460334352
Testing Time: 0.0019368065728081597 (+/-) 0.003451441668091678
Training Time: 2.0254308382670083 (+/-) 0.0474238403005755
=== Average network evolution ===
Total hidden node: 40.92142857142857 (+/-) 11.315329587487472
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=3, out_features=51, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 3, nodes: 51, parameters: 207
Layer 2, outputLayerBasicNet: Linear(in_features=51, out_features=4, bias=True); inputs: 51, outputs: 4, parameters: 208
========== Performance occupancy ==========
Preq Accuracy: 98.36 (+/-) 0.19
F1 score: 0.98 (+/-) 0.0
Precision: 0.98 (+/-) 0.0
Recall: 0.98 (+/-) 0.0
Training time: 2.03 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 56.2 (+/-) 2.71

Infinite Delay
100% (280 of 280) |######################| Elapsed Time: 0:07:44 ETA: 00:00:00
=== Performance result ===
Accuracy: 31.410394265232974 (+/-) 9.706964550077153
C:\Users\SCSE\AppData\Local\Continuum\miniconda3\envs\stmicro\lib\site-packages\sklearn\metrics\_classification.py:1221: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
  _warn_prf(average, modifier, msg_start, len(result))
Precision: 0.34188549776389554
Recall: 0.31410394265232977
F1 score: 0.2017385057028227
Testing Time: 0.0015167215818999917 (+/-) 0.0005152670757772471
Training Time: 1.6487033358611514 (+/-) 0.0353181638411217
=== Average network evolution ===
Total hidden node: 7.7785714285714285 (+/-) 7.7785714285714285
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=3, out_features=9, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 3, nodes: 9, parameters: 39
Layer 2, outputLayerBasicNet: Linear(in_features=9, out_features=4, bias=True); inputs: 9, outputs: 4, parameters: 40
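The UndefinedMetricWarning interleaved with this run comes from scikit-learn scoring a class that never appears in the predictions. A minimal reproduction on toy labels (not data from the experiment), using the `zero_division` parameter the warning itself suggests:

```python
from sklearn.metrics import precision_score

# Toy example: class 2 never appears in y_pred, so its precision is undefined.
y_true = [0, 1, 2, 2]
y_pred = [0, 0, 1, 1]

# zero_division=0 scores the empty class as 0.0 without emitting the warning,
# which is exactly what the default behavior does noisily.
p = precision_score(y_true, y_pred, average='weighted', zero_division=0)
```

Passing `zero_division=1` (or, in recent scikit-learn, `zero_division=np.nan`) are the other supported choices when an empty class should not drag the average down.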
100% (280 of 280) |######################| Elapsed Time: 0:07:54 ETA: 00:00:00
=== Performance result ===
Accuracy: 52.41720430107527 (+/-) 7.246118085534872
Precision: 0.4135975469002727
Recall: 0.5241720430107527
F1 score: 0.44654189939745154
Testing Time: 0.0014444363159945362 (+/-) 0.0005333543502286279
Training Time: 1.683700058195326 (+/-) 0.07182905324917416
=== Average network evolution ===
Total hidden node: 6.9714285714285715 (+/-) 6.9714285714285715
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=3, out_features=6, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 3, nodes: 6, parameters: 27
Layer 2, outputLayerBasicNet: Linear(in_features=6, out_features=4, bias=True); inputs: 6, outputs: 4, parameters: 28
100% (280 of 280) |######################| Elapsed Time: 0:07:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 39.24910394265233 (+/-) 4.554841953195149
Precision: 0.43445530392104825
Recall: 0.3924910394265233
F1 score: 0.30389192686221667
Testing Time: 0.0014920311589394846 (+/-) 0.0005200306165009971
Training Time: 1.652131998410789 (+/-) 0.03426592415383448
=== Average network evolution ===
Total hidden node: 9.153571428571428 (+/-) 9.153571428571428
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=3, out_features=9, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 3, nodes: 9, parameters: 39
Layer 2, outputLayerBasicNet: Linear(in_features=9, out_features=4, bias=True); inputs: 9, outputs: 4, parameters: 40
100% (280 of 280) |######################| Elapsed Time: 0:07:44 ETA: 00:00:00
=== Performance result ===
Accuracy: 34.983870967741936 (+/-) 4.526274318814219
Precision: 0.4266029024844088
Recall: 0.34983870967741937
F1 score: 0.32155222104565057
Testing Time: 0.0014608983070619644 (+/-) 0.0005224629804787121
Training Time: 1.6477424824964189 (+/-) 0.023482086403910286
=== Average network evolution ===
Total hidden node: 8.128571428571428 (+/-) 8.128571428571428
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=3, out_features=8, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 3, nodes: 8, parameters: 35
Layer 2, outputLayerBasicNet: Linear(in_features=8, out_features=4, bias=True); inputs: 8, outputs: 4, parameters: 36
100% (280 of 280) |######################| Elapsed Time: 0:07:44 ETA: 00:00:00
=== Performance result ===
Accuracy: 51.445519713261646 (+/-) 4.475905987318934
Precision: 0.6126508207389302
Recall: 0.5144551971326164
F1 score: 0.43742845272802844
Testing Time: 0.0015578680140997773 (+/-) 0.0022953760685736142
Training Time: 1.650023498842793 (+/-) 0.025702613646082263
=== Average network evolution ===
Total hidden node: 13.989285714285714 (+/-) 13.989285714285714
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=3, out_features=14, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 3, nodes: 14, parameters: 59
Layer 2, outputLayerBasicNet: Linear(in_features=14, out_features=4, bias=True); inputs: 14, outputs: 4, parameters: 60
========== Performance occupancy ==========
Preq Accuracy: 41.9 (+/-) 8.56
F1 score: 0.34 (+/-) 0.09
Precision: 0.45 (+/-) 0.09
Recall: 0.42 (+/-) 0.09
Training time: 1.66 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 9.2 (+/-) 2.64
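The summary figures aggregate the five runs of the section. As a sanity check, the reported Preq Accuracy of 41.9 (+/-) 8.56 is recoverable from the five per-run accuracies above using a population standard deviation (NumPy's default, ddof=0):

```python
import numpy as np

# Per-run accuracies copied from the five "Infinite Delay" runs above.
accuracies = np.array([
    31.410394265232974,
    52.41720430107527,
    39.24910394265233,
    34.983870967741936,
    51.445519713261646,
])

mean_acc = accuracies.mean()
std_acc = accuracies.std()  # ddof=0 (population std) matches the printed spread
```

Had the summary used the sample standard deviation (ddof=1), the spread would print as roughly 9.57 instead of 8.56, so the log's aggregation is unambiguous on this point.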
%run DEVDAN_occupancy.ipynb
Number of input: 5
Number of output: 2
Number of batch: 20

All Data
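The batches below are evaluated prequentially: each incoming batch is tested first and then used for training, and the "Preq Accuracy" figures average the per-batch scores. A minimal sketch of that protocol, with a hypothetical majority-class stand-in for the model (DEVDAN's actual classifier is in the notebook, not reproduced here):

```python
class SimpleMajorityModel:
    """Toy stand-in: predicts the most frequently seen label so far."""
    def __init__(self):
        self.counts = {}

    def predict(self, xs):
        majority = max(self.counts, key=self.counts.get) if self.counts else 0
        return [majority for _ in xs]

    def train(self, xs, ys):
        for y in ys:
            self.counts[y] = self.counts.get(y, 0) + 1

def prequential_accuracy(model, batches):
    """Test-then-train loop over (inputs, labels) batches."""
    accs = []
    for xs, ys in batches:
        preds = model.predict(xs)          # each batch is tested first...
        accs.append(sum(p == y for p, y in zip(preds, ys)) / len(ys))
        model.train(xs, ys)                # ...then used for training
    return sum(accs) / len(accs)
```

Because the model never sees a batch before scoring it, prequential accuracy is a fair streaming estimate; the first batches are scored by an untrained model, which is why per-run accuracies carry large (+/-) spreads.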
100% (20 of 20) |########################| Elapsed Time: 0:00:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 94.12631578947368 (+/-) 12.167681183116247
Precision: 0.9417026075418174
Recall: 0.9412631578947368
F1 score: 0.9392022982421169
Testing Time: 0.001730053048384817 (+/-) 0.0004345990837267352
Training Time: 3.1115188347665885 (+/-) 0.02442550053832812
=== Average network evolution ===
Total hidden node: 28.25 (+/-) 11.932623349456732
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=44, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 44, parameters: 269
Layer 2, outputLayerBasicNet: Linear(in_features=44, out_features=2, bias=True); inputs: 44, outputs: 2, parameters: 90
100% (20 of 20) |########################| Elapsed Time: 0:01:00 ETA: 00:00:00
=== Performance result ===
Accuracy: 93.81052631578946 (+/-) 12.164049478812215
Precision: 0.9382032792867195
Recall: 0.9381052631578948
F1 score: 0.9359791678273631
Testing Time: 0.0019296093990928249 (+/-) 0.0003916776449929711
Training Time: 3.159525996760318 (+/-) 0.052620773895586975
=== Average network evolution ===
Total hidden node: 49.5 (+/-) 12.031209415515965
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=66, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 66, parameters: 401
Layer 2, outputLayerBasicNet: Linear(in_features=66, out_features=2, bias=True); inputs: 66, outputs: 2, parameters: 134
100% (20 of 20) |########################| Elapsed Time: 0:00:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.48947368421054 (+/-) 12.658340073137639
Precision: 0.9129173208274038
Recall: 0.9148947368421053
F1 score: 0.9130112959570756
Testing Time: 0.0017166765112625925 (+/-) 0.0005482329221608689
Training Time: 3.138629072590878 (+/-) 0.04074785126466747
=== Average network evolution ===
Total hidden node: 31.9 (+/-) 11.022250223978768
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=47, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 47, parameters: 287
Layer 2, outputLayerBasicNet: Linear(in_features=47, out_features=2, bias=True); inputs: 47, outputs: 2, parameters: 96
100% (20 of 20) |########################| Elapsed Time: 0:00:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 94.10526315789473 (+/-) 12.20769704063838
Precision: 0.940918058689336
Recall: 0.9410526315789474
F1 score: 0.9392777195936105
Testing Time: 0.0016643248106303968 (+/-) 0.0005660011983113068
Training Time: 3.1298930394022086 (+/-) 0.028842353283985152
=== Average network evolution ===
Total hidden node: 44.85 (+/-) 12.426886174742247
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=61, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 61, parameters: 371
Layer 2, outputLayerBasicNet: Linear(in_features=61, out_features=2, bias=True); inputs: 61, outputs: 2, parameters: 124
100% (20 of 20) |########################| Elapsed Time: 0:00:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.38421052631578 (+/-) 13.75856721265526
Precision: 0.924750514932838
Recall: 0.9238421052631579
F1 score: 0.9200112410838851
Testing Time: 0.0017785524067125823 (+/-) 0.0005197186644303318
Training Time: 3.1248184003328023 (+/-) 0.024422274058000282
=== Average network evolution ===
Total hidden node: 40.9 (+/-) 12.93792873685738
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=58, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 58, parameters: 353
Layer 2, outputLayerBasicNet: Linear(in_features=58, out_features=2, bias=True); inputs: 58, outputs: 2, parameters: 118
========== Performance occupancy ==========
Preq Accuracy: 93.18 (+/-) 1.06
F1 score: 0.93 (+/-) 0.01
Precision: 0.93 (+/-) 0.01
Recall: 0.93 (+/-) 0.01
Training time: 3.13 (+/-) 0.02
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 55.2 (+/-) 8.38

50% Data
100% (20 of 20) |########################| Elapsed Time: 0:00:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 87.60526315789473 (+/-) 16.072515726321537
Precision: 0.872243696469172
Recall: 0.8760526315789474
F1 score: 0.8733691977334447
Testing Time: 0.001734093615883275 (+/-) 0.0005333696538404918
Training Time: 2.3769333864513196 (+/-) 0.006202818252542381
=== Average network evolution ===
Total hidden node: 38.45 (+/-) 12.839295151993353
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=55, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 55, parameters: 335
Layer 2, outputLayerBasicNet: Linear(in_features=55, out_features=2, bias=True); inputs: 55, outputs: 2, parameters: 112
100% (20 of 20) |########################| Elapsed Time: 0:00:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.91052631578947 (+/-) 12.283274667819528
Precision: 0.9173429821100337
Recall: 0.9191052631578948
F1 score: 0.917408148397425
Testing Time: 0.0016140435871325042 (+/-) 0.0004898602271977422
Training Time: 2.3798038081118933 (+/-) 0.013616793705851036
=== Average network evolution ===
Total hidden node: 32.3 (+/-) 10.654107189248661
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=46, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 46, parameters: 281
Layer 2, outputLayerBasicNet: Linear(in_features=46, out_features=2, bias=True); inputs: 46, outputs: 2, parameters: 94
100% (20 of 20) |########################| Elapsed Time: 0:00:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 87.36842105263158 (+/-) 17.672460068574104
Precision: 0.8700473239695742
Recall: 0.8736842105263158
F1 score: 0.8643250802029355
Testing Time: 0.0018304147218403063 (+/-) 0.0005872034986080351
Training Time: 2.3866209230924906 (+/-) 0.021451343352490174
=== Average network evolution ===
Total hidden node: 31.65 (+/-) 12.301524295793591
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=46, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 46, parameters: 281
Layer 2, outputLayerBasicNet: Linear(in_features=46, out_features=2, bias=True); inputs: 46, outputs: 2, parameters: 94
100% (20 of 20) |########################| Elapsed Time: 0:00:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 87.89473684210527 (+/-) 14.350773816811211
Precision: 0.8749488467663327
Recall: 0.8789473684210526
F1 score: 0.8715753736179883
Testing Time: 0.0015674013840524775 (+/-) 0.0004904560890028188
Training Time: 2.3863525641591927 (+/-) 0.02658769584352912
=== Average network evolution ===
Total hidden node: 26.3 (+/-) 11.506954418958998
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=41, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 41, parameters: 251
Layer 2, outputLayerBasicNet: Linear(in_features=41, out_features=2, bias=True); inputs: 41, outputs: 2, parameters: 84
100% (20 of 20) |########################| Elapsed Time: 0:00:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 87.34210526315789 (+/-) 17.994452823300083
Precision: 0.869182135321153
Recall: 0.873421052631579
F1 score: 0.8647136210766404
Testing Time: 0.001662982137579667 (+/-) 0.0005698806249189355
Training Time: 2.3986512485303377 (+/-) 0.035002228841415406
=== Average network evolution ===
Total hidden node: 33.25 (+/-) 12.259180233604528
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=49, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 49, parameters: 299
Layer 2, outputLayerBasicNet: Linear(in_features=49, out_features=2, bias=True); inputs: 49, outputs: 2, parameters: 100
========== Performance occupancy ==========
Preq Accuracy: 88.42 (+/-) 1.75
F1 score: 0.88 (+/-) 0.02
Precision: 0.88 (+/-) 0.02
Recall: 0.88 (+/-) 0.02
Training time: 2.39 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 47.4 (+/-) 4.59

25% Data
100% (20 of 20) |########################| Elapsed Time: 0:00:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.0 (+/-) 13.084703805750076
Precision: 0.9095697739830984
Recall: 0.91
F1 score: 0.9097721917869059
Testing Time: 0.0017238541653281764 (+/-) 0.00043059539875961694
Training Time: 2.042232450686003 (+/-) 0.04117742082942164
=== Average network evolution ===
Total hidden node: 27.05 (+/-) 8.605085705558079
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=40, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 40, parameters: 245
Layer 2, outputLayerBasicNet: Linear(in_features=40, out_features=2, bias=True); inputs: 40, outputs: 2, parameters: 82
100% (20 of 20) |########################| Elapsed Time: 0:00:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 89.41578947368419 (+/-) 13.440130918766133
Precision: 0.8918804902691551
Recall: 0.8941578947368422
F1 score: 0.8926570755427262
Testing Time: 0.0016226015592876234 (+/-) 0.0005843436607858528
Training Time: 2.0351304756967643 (+/-) 0.029165551791218214
=== Average network evolution ===
Total hidden node: 21.0 (+/-) 8.573214099741124
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=33, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 33, parameters: 203
Layer 2, outputLayerBasicNet: Linear(in_features=33, out_features=2, bias=True); inputs: 33, outputs: 2, parameters: 68
100% (20 of 20) |########################| Elapsed Time: 0:00:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 86.09473684210526 (+/-) 16.149155874443213
Precision: 0.8541025071575736
Recall: 0.8609473684210527
F1 score: 0.8535046581699114
Testing Time: 0.001569358926070364 (+/-) 0.0004944556420625095
Training Time: 2.032375875272249 (+/-) 0.02918430322289312
=== Average network evolution ===
Total hidden node: 25.05 (+/-) 7.003392035292612
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=35, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 35, parameters: 215
Layer 2, outputLayerBasicNet: Linear(in_features=35, out_features=2, bias=True); inputs: 35, outputs: 2, parameters: 72
100% (20 of 20) |########################| Elapsed Time: 0:00:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 86.69473684210526 (+/-) 17.53677260364063
Precision: 0.8613312539315904
Recall: 0.8669473684210526
F1 score: 0.8582285895025581
Testing Time: 0.0017787280835603412 (+/-) 0.00041543628684626585
Training Time: 2.0263755321502686 (+/-) 0.03199586071753423
=== Average network evolution ===
Total hidden node: 28.2 (+/-) 10.380751417888783
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=41, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 41, parameters: 251
Layer 2, outputLayerBasicNet: Linear(in_features=41, out_features=2, bias=True); inputs: 41, outputs: 2, parameters: 84
100% (20 of 20) |########################| Elapsed Time: 0:00:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 84.62631578947368 (+/-) 15.846628135622357
Precision: 0.8378515269468836
Recall: 0.8462631578947368
F1 score: 0.8389896876223986
Testing Time: 0.0017660918988679584 (+/-) 0.0005190602958871454
Training Time: 2.0120899049859298 (+/-) 0.01363160559605076
=== Average network evolution ===
Total hidden node: 29.6 (+/-) 11.547294055318762
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=44, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 44, parameters: 269
Layer 2, outputLayerBasicNet: Linear(in_features=44, out_features=2, bias=True); inputs: 44, outputs: 2, parameters: 90
========== Performance occupancy ==========
Preq Accuracy: 87.57 (+/-) 2.31
F1 score: 0.87 (+/-) 0.03
Precision: 0.87 (+/-) 0.03
Recall: 0.88 (+/-) 0.02
Training time: 2.03 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 38.6 (+/-) 4.03

Infinite Delay
100% (20 of 20) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.37894736842105 (+/-) 11.173068133845138
Precision: 0.9163031494420744
Recall: 0.9137894736842105
F1 score: 0.9080525272733756
Testing Time: 0.0016627060739617598 (+/-) 0.0005659536710162378
Training Time: 1.6492875249762284 (+/-) 0.025503638534587997
=== Average network evolution ===
Total hidden node: 14.0 (+/-) 14.0
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=14, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 14, parameters: 89
Layer 2, outputLayerBasicNet: Linear(in_features=14, out_features=2, bias=True); inputs: 14, outputs: 2, parameters: 30
100% (20 of 20) |########################| Elapsed Time: 0:00:36 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.4842105263158 (+/-) 12.722845724290604
Precision: 0.9169503406266513
Recall: 0.9148421052631579
F1 score: 0.9094121978720788
Testing Time: 0.0016207318556936163 (+/-) 0.0005768451525426626
Training Time: 1.7520778806586015 (+/-) 0.10397831349668323
=== Average network evolution ===
Total hidden node: 14.9 (+/-) 14.9
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=15, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 15, parameters: 95
Layer 2, outputLayerBasicNet: Linear(in_features=15, out_features=2, bias=True); inputs: 15, outputs: 2, parameters: 32
100% (20 of 20) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 94.70526315789473 (+/-) 5.304960308816303
Precision: 0.9473117368341452
Recall: 0.9470526315789474
F1 score: 0.9471711677484661
Testing Time: 0.0017188222784745065 (+/-) 0.0005392224086763533
Training Time: 1.6505292591295744 (+/-) 0.018209329865814614
=== Average network evolution ===
Total hidden node: 20.85 (+/-) 20.85
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=21, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 21, parameters: 131
Layer 2, outputLayerBasicNet: Linear(in_features=21, out_features=2, bias=True); inputs: 21, outputs: 2, parameters: 44
100% (20 of 20) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.78421052631579 (+/-) 9.36484260456684
Precision: 0.9301252196977988
Recall: 0.9278421052631579
F1 score: 0.9238726997525788
Testing Time: 0.001508173189665142 (+/-) 0.0005970647588017396
Training Time: 1.648397633903905 (+/-) 0.010864549638751405
=== Average network evolution ===
Total hidden node: 14.8 (+/-) 14.8
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=15, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 15, parameters: 95
Layer 2, outputLayerBasicNet: Linear(in_features=15, out_features=2, bias=True); inputs: 15, outputs: 2, parameters: 32
100% (20 of 20) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.29473684210525 (+/-) 13.747668651513266
Precision: 0.907797686145002
Recall: 0.9029473684210526
F1 score: 0.8946544769148023
Testing Time: 0.0017725793938887747 (+/-) 0.000697708214450811
Training Time: 1.6471675571642423 (+/-) 0.00993062416170704
=== Average network evolution ===
Total hidden node: 15.0 (+/-) 15.0
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=5, out_features=15, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 5, nodes: 15, parameters: 95
Layer 2, outputLayerBasicNet: Linear(in_features=15, out_features=2, bias=True); inputs: 15, outputs: 2, parameters: 32
========== Performance occupancy ==========
Preq Accuracy: 92.13 (+/-) 1.51
F1 score: 0.92 (+/-) 0.02
Precision: 0.92 (+/-) 0.01
Recall: 0.92 (+/-) 0.02
Training time: 1.67 (+/-) 0.04
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 16.0 (+/-) 2.53
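The printed parameter counts follow a consistent pattern across every run in this log: the hidden layer reports in*nodes + nodes + in, and the output layer reports nodes*out + out. The extra `in` term in the hidden layer is presumably the bias of the decoding path DEVDAN couples to each hidden layer; that reading is inferred from the counts themselves, not from the implementation. A sketch that reproduces the logged figures:

```python
def hidden_layer_params(n_in, n_nodes):
    # Encoder weights (n_in * n_nodes) + encoder bias (n_nodes),
    # plus an extra n_in term, consistent with a decoder bias of size
    # n_in (assumption inferred from the printed counts).
    return n_in * n_nodes + n_nodes + n_in

def output_layer_params(n_nodes, n_out):
    # Plain linear layer: weights + bias.
    return n_nodes * n_out + n_out
```

For example, the occupancy runs with 5 inputs and 15 hidden nodes print 95 hidden-layer parameters (5*15 + 15 + 5), and the credit-card runs with 24 inputs and 23 nodes print 599 (24*23 + 23 + 24).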
%run DEVDAN_creditcarddefault.ipynb
Number of input: 24
Number of output: 2
Number of batch: 30

All Data
100% (30 of 30) |########################| Elapsed Time: 0:01:31 ETA: 00:00:00
=== Performance result ===
Accuracy: 79.27931034482758 (+/-) 4.68279163057788
Precision: 0.7618905682479733
Recall: 0.7927931034482759
F1 score: 0.7569161211148352
Testing Time: 0.0024629214714313374 (+/-) 0.0006169476935006807
Training Time: 3.1432375743471344 (+/-) 0.03728141552273546
=== Average network evolution ===
Total hidden node: 22.766666666666666 (+/-) 0.42295258468165065
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=24, out_features=23, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 24, nodes: 23, parameters: 599
Layer 2, outputLayerBasicNet: Linear(in_features=23, out_features=2, bias=True); inputs: 23, outputs: 2, parameters: 48
100% (30 of 30) |########################| Elapsed Time: 0:01:30 ETA: 00:00:00
=== Performance result ===
Accuracy: 79.9896551724138 (+/-) 2.665920922280433
Precision: 0.7806855217681604
Recall: 0.799896551724138
F1 score: 0.7498974932146996
Testing Time: 0.002342372105039399 (+/-) 0.0006770066891784815
Training Time: 3.124590010478579 (+/-) 0.05100577509587114
=== Average network evolution ===
Total hidden node: 8.4 (+/-) 0.6110100926607787
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=24, out_features=9, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 24, nodes: 9, parameters: 249
Layer 2, outputLayerBasicNet: Linear(in_features=9, out_features=2, bias=True); inputs: 9, outputs: 2, parameters: 20
100% (30 of 30) |########################| Elapsed Time: 0:01:31 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.18275862068967 (+/-) 2.450062485553243
Precision: 0.7816316554281255
Recall: 0.8018275862068965
F1 score: 0.7562892714526392
Testing Time: 0.0025006080495900123 (+/-) 0.0005592800434525817
Training Time: 3.1581902175114074 (+/-) 0.050302436396429756
=== Average network evolution ===
Total hidden node: 25.066666666666666 (+/-) 4.781445620544295
=== Final network structure ===
Layer 1, hiddenLayerBasicNet: Linear(in_features=24, out_features=27, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 24, nodes: 27, parameters: 699
Layer 2, outputLayerBasicNet: Linear(in_features=27, out_features=2, bias=True); inputs: 27, outputs: 2, parameters: 56
100% (30 of 30) |########################| Elapsed Time: 0:01:31 ETA: 00:00:00
=== Performance result === Accuracy: 80.54482758620689 (+/-) 2.255810307336873 Precision: 0.7847502637341663 Recall: 0.805448275862069 F1 score: 0.7670546648919138 Testing Time: 0.0023213912700784617 (+/-) 0.0004731759122633451 Training Time: 3.151699690983213 (+/-) 0.02857634124793602 === Average network evolution === Total hidden node: 25.666666666666668 (+/-) 1.0434983894999017 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=26, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 24 No. of nodes : 26 No. of parameters : 674 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=26, out_features=2, bias=True) ) No. of inputs : 26 No. of output : 2 No. of parameters : 54
100% (30 of 30) |########################| Elapsed Time: 0:01:31 ETA: 00:00:00
=== Performance result === Accuracy: 78.96206896551725 (+/-) 6.299612127802868 Precision: 0.7566961942964026 Recall: 0.7896206896551724 F1 score: 0.7535155260718467 Testing Time: 0.0024283425561312973 (+/-) 0.0006159721072666875 Training Time: 3.1666212739615607 (+/-) 0.07474227434558949 === Average network evolution === Total hidden node: 24.966666666666665 (+/-) 4.771326393735346 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=27, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 24 No. of nodes : 27 No. of parameters : 699 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=27, out_features=2, bias=True) ) No. of inputs : 27 No. of output : 2 No. of parameters : 56
========== Performance occupancy ==========
Preq Accuracy: 79.79 (+/-) 0.58
F1 score: 0.76 (+/-) 0.01
Precision: 0.77 (+/-) 0.01
Recall: 0.8 (+/-) 0.01
Training time: 3.15 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 22.4 (+/-) 6.86

50% Data
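The "(+/-)" figures in the summary appear to be the mean and population standard deviation over the five runs' prequential accuracies. A quick check, with the accuracy values copied from the five "All Data" run logs above, reproduces them:

```python
from statistics import mean, pstdev

# Prequential accuracies of the five occupancy "All Data" runs above
acc = [79.27931034482758, 79.9896551724138, 80.18275862068967,
       80.54482758620689, 78.96206896551725]

# pstdev (population std, ddof=0) matches the reported "(+/-)" spread
print(round(mean(acc), 2), round(pstdev(acc), 2))  # 79.79 0.58
```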
100% (30 of 30) |########################| Elapsed Time: 0:01:10 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.20689655172414 (+/-) 2.520526906392824
Precision: 0.7795456061693482
Recall: 0.8020689655172414
F1 score: 0.7606792104779698
Testing Time: 0.0022929208032016098 (+/-) 0.00046592287797755645
Training Time: 2.4224132751596383 (+/-) 0.03953185063739474
=== Average network evolution ===
Total hidden node: 26.166666666666668 (+/-) 1.5293426329272615
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=28, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 28
No. of parameters : 724
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=28, out_features=2, bias=True)
)
No. of inputs : 28
No. of output : 2
No. of parameters : 58
100% (30 of 30) |########################| Elapsed Time: 0:01:09 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.15172413793104 (+/-) 2.4471828546106065
Precision: 0.7789380637720327
Recall: 0.8015172413793104
F1 score: 0.7590956617645126
Testing Time: 0.0026160684125176793 (+/-) 0.0005378171657308437
Training Time: 2.401319273586931 (+/-) 0.030069161775021894
=== Average network evolution ===
Total hidden node: 25.366666666666667 (+/-) 0.7063206700139029
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=26, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 26
No. of parameters : 674
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=26, out_features=2, bias=True)
)
No. of inputs : 26
No. of output : 2
No. of parameters : 54
100% (30 of 30) |########################| Elapsed Time: 0:01:09 ETA: 00:00:00
=== Performance result ===
Accuracy: 77.73793103448274 (+/-) 9.88542688430292
Precision: 0.7446404420428647
Recall: 0.7773793103448275
F1 score: 0.750867424694555
Testing Time: 0.002423516635237069 (+/-) 0.0005578814949790215
Training Time: 2.404156240923651 (+/-) 0.019416307709653528
=== Average network evolution ===
Total hidden node: 33.666666666666664 (+/-) 1.5986105077709067
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=35, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 35
No. of parameters : 899
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=35, out_features=2, bias=True)
)
No. of inputs : 35
No. of output : 2
No. of parameters : 72
100% (30 of 30) |########################| Elapsed Time: 0:01:11 ETA: 00:00:00
=== Performance result ===
Accuracy: 78.37241379310343 (+/-) 6.441401461971805
Precision: 0.7503588333699214
Recall: 0.7837241379310345
F1 score: 0.7532750559943628
Testing Time: 0.0025039541310277478 (+/-) 0.0005661124779998831
Training Time: 2.4464977691913474 (+/-) 0.0789341616322422
=== Average network evolution ===
Total hidden node: 24.4 (+/-) 0.9865765724632497
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=25, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 25
No. of parameters : 649
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=25, out_features=2, bias=True)
)
No. of inputs : 25
No. of output : 2
No. of parameters : 52
100% (30 of 30) |########################| Elapsed Time: 0:01:10 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.10000000000001 (+/-) 2.190575424324662
Precision: 0.7773100028022603
Recall: 0.801
F1 score: 0.7597471609441068
Testing Time: 0.0023942980273016566 (+/-) 0.0005584794959237563
Training Time: 2.4142231283516717 (+/-) 0.03093462727078375
=== Average network evolution ===
Total hidden node: 24.666666666666668 (+/-) 1.2995725793078614
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=26, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 26
No. of parameters : 674
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=26, out_features=2, bias=True)
)
No. of inputs : 26
No. of output : 2
No. of parameters : 54
========== Performance occupancy ==========
Preq Accuracy: 79.31 (+/-) 1.05
F1 score: 0.76 (+/-) 0.0
Precision: 0.77 (+/-) 0.02
Recall: 0.79 (+/-) 0.01
Training time: 2.42 (+/-) 0.02
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 28.0 (+/-) 3.63

25% Data
100% (30 of 30) |########################| Elapsed Time: 0:00:58 ETA: 00:00:00
=== Performance result ===
Accuracy: 77.17241379310344 (+/-) 7.595774530969636
Precision: 0.740289132036049
Recall: 0.7717241379310344
F1 score: 0.7482377080058107
Testing Time: 0.002598885832161739 (+/-) 0.0006693018373427925
Training Time: 2.029159373250501 (+/-) 0.014429045474099001
=== Average network evolution ===
Total hidden node: 30.633333333333333 (+/-) 1.3535960336164639
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=31, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 31
No. of parameters : 799
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=31, out_features=2, bias=True)
)
No. of inputs : 31
No. of output : 2
No. of parameters : 64
100% (30 of 30) |########################| Elapsed Time: 0:00:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 77.54827586206896 (+/-) 7.823731391696506
Precision: 0.7406180403943232
Recall: 0.7754827586206896
F1 score: 0.7469928777515342
Testing Time: 0.002497500386731378 (+/-) 0.0005028338364391314
Training Time: 2.032955556080259 (+/-) 0.022217459568040734
=== Average network evolution ===
Total hidden node: 23.533333333333335 (+/-) 1.0241527663824812
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=25, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 25
No. of parameters : 649
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=25, out_features=2, bias=True)
)
No. of inputs : 25
No. of output : 2
No. of parameters : 52
100% (30 of 30) |########################| Elapsed Time: 0:00:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 78.45862068965515 (+/-) 5.140207780677494
Precision: 0.7464460571405379
Recall: 0.7845862068965517
F1 score: 0.7424503830745596
Testing Time: 0.002570925087764345 (+/-) 0.0004913868311809277
Training Time: 2.036432693744528 (+/-) 0.02734745138627772
=== Average network evolution ===
Total hidden node: 21.133333333333333 (+/-) 1.4996295838935993
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=23, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 23
No. of parameters : 599
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=23, out_features=2, bias=True)
)
No. of inputs : 23
No. of output : 2
No. of parameters : 48
100% (30 of 30) |########################| Elapsed Time: 0:00:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 76.55862068965517 (+/-) 11.413649337567335
Precision: 0.7236079411314116
Recall: 0.7655862068965518
F1 score: 0.7325835189173595
Testing Time: 0.002711345409524852 (+/-) 0.0007416231357634427
Training Time: 2.0496684600566994 (+/-) 0.04864396345403809
=== Average network evolution ===
Total hidden node: 39.03333333333333 (+/-) 5.896232318655325
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=41, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 41
No. of parameters : 1049
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=41, out_features=2, bias=True)
)
No. of inputs : 41
No. of output : 2
No. of parameters : 84
100% (30 of 30) |########################| Elapsed Time: 0:00:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 76.39310344827587 (+/-) 7.611331436133744
Precision: 0.7298635229844239
Recall: 0.7639310344827587
F1 score: 0.7393716528395402
Testing Time: 0.0026394580972605736 (+/-) 0.000545270909923563
Training Time: 2.0439424103703994 (+/-) 0.036267434600251336
=== Average network evolution ===
Total hidden node: 29.366666666666667 (+/-) 4.1027091320519204
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=31, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 31
No. of parameters : 799
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=31, out_features=2, bias=True)
)
No. of inputs : 31
No. of output : 2
No. of parameters : 64
========== Performance occupancy ==========
Preq Accuracy: 77.23 (+/-) 0.74
F1 score: 0.74 (+/-) 0.01
Precision: 0.74 (+/-) 0.01
Recall: 0.77 (+/-) 0.01
Training time: 2.04 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 30.2 (+/-) 6.27

Infinite Delay
100% (30 of 30) |########################| Elapsed Time: 0:00:51 ETA: 00:00:00
=== Performance result ===
Accuracy: 77.88620689655173 (+/-) 2.5099421806681654
Precision: 0.769578268282787
Recall: 0.7788620689655172
F1 score: 0.6826767469203531
Testing Time: 0.0031945869840424635 (+/-) 0.0007930072946805932
Training Time: 1.6654956916282917 (+/-) 0.02515726468773376
=== Average network evolution ===
Total hidden node: 16.733333333333334 (+/-) 16.733333333333334
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=19, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 19
No. of parameters : 499
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=19, out_features=2, bias=True)
)
No. of inputs : 19
No. of output : 2
No. of parameters : 40
100% (30 of 30) |########################| Elapsed Time: 0:00:51 ETA: 00:00:00
=== Performance result ===
Accuracy: 77.86551724137932 (+/-) 2.5064435392044997
Precision: 0.7538825779066591
Recall: 0.7786551724137931
F1 score: 0.6820554864008568
Testing Time: 0.002398359364476697 (+/-) 0.0005649736454469162
Training Time: 1.655294706081522 (+/-) 0.013172774831785014
=== Average network evolution ===
Total hidden node: 19.966666666666665 (+/-) 19.966666666666665
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=20, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 20
No. of parameters : 524
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=20, out_features=2, bias=True)
)
No. of inputs : 20
No. of output : 2
No. of parameters : 42
100% (30 of 30) |########################| Elapsed Time: 0:00:52 ETA: 00:00:00
=== Performance result ===
Accuracy: 24.075862068965524 (+/-) 10.494092103866029
Precision: 0.6563211379310345
Recall: 0.24075862068965517
F1 score: 0.13131070184378932
Testing Time: 0.003325503447960163 (+/-) 0.000653637931636334
Training Time: 1.6800545741771828 (+/-) 0.027289111519297093
=== Average network evolution ===
Total hidden node: 19.2 (+/-) 19.2
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=20, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 20
No. of parameters : 524
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=20, out_features=2, bias=True)
)
No. of inputs : 20
No. of output : 2
No. of parameters : 42
100% (30 of 30) |########################| Elapsed Time: 0:00:51 ETA: 00:00:00
C:\Users\SCSE\AppData\Local\Continuum\miniconda3\envs\stmicro\lib\site-packages\sklearn\metrics\_classification.py:1221: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
  _warn_prf(average, modifier, msg_start, len(result))
=== Performance result ===
Accuracy: 77.85517241379311 (+/-) 2.505247762115322
Precision: 0.6061427871581452
Recall: 0.7785517241379311
F1 score: 0.6816138984678044
Testing Time: 0.0025337400107548155 (+/-) 0.0005621418010552423
Training Time: 1.6651170089327056 (+/-) 0.017723554261637457
=== Average network evolution ===
Total hidden node: 21.3 (+/-) 21.3
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=22, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 22
No. of parameters : 574
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=22, out_features=2, bias=True)
)
No. of inputs : 22
No. of output : 2
No. of parameters : 46
100% (30 of 30) |########################| Elapsed Time: 0:00:52 ETA: 00:00:00
=== Performance result ===
Accuracy: 77.94827586206897 (+/-) 2.4547800926377215
Precision: 0.7587303163523433
Recall: 0.7794827586206896
F1 score: 0.6852941571748169
Testing Time: 0.0023812507760935814 (+/-) 0.0004770358391224454
Training Time: 1.6858411904039055 (+/-) 0.03339943032518744
=== Average network evolution ===
Total hidden node: 30.133333333333333 (+/-) 30.133333333333333
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=24, out_features=30, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 24
No. of nodes : 30
No. of parameters : 774
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=30, out_features=2, bias=True)
)
No. of inputs : 30
No. of output : 2
No. of parameters : 62
========== Performance occupancy ==========
Preq Accuracy: 67.13 (+/-) 21.53
F1 score: 0.57 (+/-) 0.22
Precision: 0.71 (+/-) 0.07
Recall: 0.67 (+/-) 0.22
Training time: 1.67 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 22.2 (+/-) 4.02
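The per-layer "No. of parameters" figures in these logs can be reproduced from the layer shapes. The output-layer counts equal the usual weights-plus-bias count of a `Linear(n_in, n_out)` layer; the hidden-layer counts exceed that by exactly `in_features`, which plausibly corresponds to a decoder bias from DEVDAN's generative phase (an assumption; the logs do not say). A minimal sketch:

```python
def linear_params(n_in, n_out):
    # Weights plus bias of a standard Linear(n_in, n_out) layer
    return n_in * n_out + n_out

# Output layer, e.g. Linear(23, 2) from the first occupancy run
print(linear_params(23, 2))        # 48, as logged

# Hidden layer, e.g. Linear(24, 23): 575 weights+bias, logged as 599;
# the extra 24 equals in_features (assumed decoder bias)
print(linear_params(24, 23) + 24)  # 599
```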
%run DEVDAN_electricitypricing.ipynb
Number of input: 8
Number of output: 2
Number of batch: 45

All Data
100% (45 of 45) |########################| Elapsed Time: 0:02:18 ETA: 00:00:00
=== Performance result ===
Accuracy: 69.11590909090908 (+/-) 7.438137082930295
Precision: 0.6878805969849514
Recall: 0.6911590909090909
F1 score: 0.6866115653008054
Testing Time: 0.001915552399375222 (+/-) 0.000444923711236873
Training Time: 3.1361293467608364 (+/-) 0.06438397107471447
=== Average network evolution ===
Total hidden node: 14.066666666666666 (+/-) 2.4073960113690385
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=16, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 16
No. of parameters : 152
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=16, out_features=2, bias=True)
)
No. of inputs : 16
No. of output : 2
No. of parameters : 34
100% (45 of 45) |########################| Elapsed Time: 0:02:19 ETA: 00:00:00
=== Performance result ===
Accuracy: 69.0340909090909 (+/-) 7.27101790991337
Precision: 0.687664634619826
Recall: 0.6903409090909091
F1 score: 0.688024974848844
Testing Time: 0.0020025480877269397 (+/-) 0.0005411429858569964
Training Time: 3.1674478433348914 (+/-) 0.09279921846311937
=== Average network evolution ===
Total hidden node: 14.822222222222223 (+/-) 0.8243216440440626
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=15, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 15
No. of parameters : 143
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=15, out_features=2, bias=True)
)
No. of inputs : 15
No. of output : 2
No. of parameters : 32
100% (45 of 45) |########################| Elapsed Time: 0:02:17 ETA: 00:00:00
=== Performance result ===
Accuracy: 69.36136363636363 (+/-) 7.098342185106287
Precision: 0.6920662772193343
Recall: 0.6936136363636364
F1 score: 0.6926035350075777
Testing Time: 0.0017343976280905983 (+/-) 0.0005686697325892421
Training Time: 3.1221403642134233 (+/-) 0.030190856528502532
=== Average network evolution ===
Total hidden node: 19.044444444444444 (+/-) 2.1077165691341113
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=22, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 22
No. of parameters : 206
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=22, out_features=2, bias=True)
)
No. of inputs : 22
No. of output : 2
No. of parameters : 46
100% (45 of 45) |########################| Elapsed Time: 0:02:18 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.37272727272726 (+/-) 6.213342082543882
Precision: 0.680821048785843
Recall: 0.6837272727272727
F1 score: 0.6811806717059659
Testing Time: 0.0016485831954262474 (+/-) 0.00047600237412088916
Training Time: 3.133968087759885 (+/-) 0.04957929545266326
=== Average network evolution ===
Total hidden node: 9.0 (+/-) 1.3824294235551815
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=10, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 10
No. of parameters : 98
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=10, out_features=2, bias=True)
)
No. of inputs : 10
No. of output : 2
No. of parameters : 22
100% (45 of 45) |########################| Elapsed Time: 0:02:17 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.13409090909092 (+/-) 7.3799773090873835
Precision: 0.6776965307277131
Recall: 0.681340909090909
F1 score: 0.6749474506960611
Testing Time: 0.0018241947347467596 (+/-) 0.0005223532411701956
Training Time: 3.1143344965848057 (+/-) 0.021943364230313307
=== Average network evolution ===
Total hidden node: 16.91111111111111 (+/-) 2.229321991819458
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=19, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 19
No. of parameters : 179
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=19, out_features=2, bias=True)
)
No. of inputs : 19
No. of output : 2
No. of parameters : 40
========== Performance occupancy ==========
Preq Accuracy: 68.8 (+/-) 0.47
F1 score: 0.68 (+/-) 0.01
Precision: 0.69 (+/-) 0.01
Recall: 0.69 (+/-) 0.0
Training time: 3.13 (+/-) 0.02
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 16.4 (+/-) 4.03

50% Data
100% (45 of 45) |########################| Elapsed Time: 0:01:44 ETA: 00:00:00
=== Performance result ===
Accuracy: 66.88636363636363 (+/-) 6.905320442472105
Precision: 0.6651406409466315
Recall: 0.6688636363636363
F1 score: 0.6586112862639065
Testing Time: 0.0017536607655611906 (+/-) 0.00041885168995711003
Training Time: 2.377365079793063 (+/-) 0.018819410110174538
=== Average network evolution ===
Total hidden node: 11.977777777777778 (+/-) 1.4218749576041236
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=13, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 13
No. of parameters : 125
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=13, out_features=2, bias=True)
)
No. of inputs : 13
No. of output : 2
No. of parameters : 28
100% (45 of 45) |########################| Elapsed Time: 0:01:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 64.78636363636362 (+/-) 7.159564197919608
Precision: 0.6449983547479394
Recall: 0.6478636363636363
F1 score: 0.6458557794742396
Testing Time: 0.0018013607371937144 (+/-) 0.0004383535241080121
Training Time: 2.3839974891055715 (+/-) 0.03556945042333179
=== Average network evolution ===
Total hidden node: 8.71111111111111 (+/-) 0.6539528430916515
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=9, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 9
No. of parameters : 89
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=9, out_features=2, bias=True)
)
No. of inputs : 9
No. of output : 2
No. of parameters : 20
100% (45 of 45) |########################| Elapsed Time: 0:01:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 67.31136363636364 (+/-) 5.878010482417071
Precision: 0.6690423295884627
Recall: 0.6731136363636364
F1 score: 0.6660395556676308
Testing Time: 0.001755280928178267 (+/-) 0.0005166947334056253
Training Time: 2.3952570936896582 (+/-) 0.04970239591934583
=== Average network evolution ===
Total hidden node: 11.688888888888888 (+/-) 1.2077936214845502
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=13, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 13
No. of parameters : 125
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=13, out_features=2, bias=True)
)
No. of inputs : 13
No. of output : 2
No. of parameters : 28
100% (45 of 45) |########################| Elapsed Time: 0:01:44 ETA: 00:00:00
=== Performance result ===
Accuracy: 66.26590909090908 (+/-) 5.985396743203722
Precision: 0.6582034714415427
Recall: 0.6626590909090909
F1 score: 0.6574849250712905
Testing Time: 0.001736527139490301 (+/-) 0.00048301170637693463
Training Time: 2.3815436471592295 (+/-) 0.02205901031729355
=== Average network evolution ===
Total hidden node: 10.466666666666667 (+/-) 1.32664991614216
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=11, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 11
No. of parameters : 107
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=11, out_features=2, bias=True)
)
No. of inputs : 11
No. of output : 2
No. of parameters : 24
100% (45 of 45) |########################| Elapsed Time: 0:01:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.82954545454547 (+/-) 8.335836641913106
Precision: 0.6850497598357418
Recall: 0.6882954545454546
F1 score: 0.6820121868585988
Testing Time: 0.0018252134323120117 (+/-) 0.0005619156629883253
Training Time: 2.391730373555964 (+/-) 0.03150369469761112
=== Average network evolution ===
Total hidden node: 19.022222222222222 (+/-) 1.1448219442125562
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=20, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 20
No. of parameters : 188
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=20, out_features=2, bias=True)
)
No. of inputs : 20
No. of output : 2
No. of parameters : 42
========== Performance occupancy ==========
Preq Accuracy: 66.82 (+/-) 1.32
F1 score: 0.66 (+/-) 0.01
Precision: 0.66 (+/-) 0.01
Recall: 0.67 (+/-) 0.01
Training time: 2.39 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 13.2 (+/-) 3.71

25% Data
100% (45 of 45) |########################| Elapsed Time: 0:01:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 65.95681818181818 (+/-) 7.775585266696612
Precision: 0.6573247624234327
Recall: 0.6595681818181818
F1 score: 0.6580740575833111
Testing Time: 0.0018908652392300692 (+/-) 0.00041587454585819274
Training Time: 2.007850105112249 (+/-) 0.014674920771549612
=== Average network evolution ===
Total hidden node: 16.977777777777778 (+/-) 2.4082163882922782
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=19, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 19
No. of parameters : 179
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=19, out_features=2, bias=True)
)
No. of inputs : 19
No. of output : 2
No. of parameters : 40
100% (45 of 45) |########################| Elapsed Time: 0:01:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 64.25454545454544 (+/-) 6.09822830252788
Precision: 0.6363292137671945
Recall: 0.6425454545454545
F1 score: 0.6318161633015588
Testing Time: 0.0016665404493158514 (+/-) 0.0006259663902511844
Training Time: 2.0127531452612444 (+/-) 0.027050431766965182
=== Average network evolution ===
Total hidden node: 7.844444444444444 (+/-) 0.9650689222492651
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=9, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 9
No. of parameters : 89
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=9, out_features=2, bias=True)
)
No. of inputs : 9
No. of output : 2
No. of parameters : 20
100% (45 of 45) |########################| Elapsed Time: 0:01:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 64.98636363636363 (+/-) 6.080729281512319
Precision: 0.6453805869807608
Recall: 0.6498636363636363
F1 score: 0.6456399956615226
Testing Time: 0.0019388253038579767 (+/-) 0.0005667536293338981
Training Time: 2.0177710869095544 (+/-) 0.023985740586651327
=== Average network evolution ===
Total hidden node: 13.466666666666667 (+/-) 1.0873004286866728
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=14, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 14
No. of parameters : 134
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=14, out_features=2, bias=True)
)
No. of inputs : 14
No. of output : 2
No. of parameters : 30
100% (45 of 45) |########################| Elapsed Time: 0:01:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 64.35000000000001 (+/-) 5.597219277137857
Precision: 0.6400026079673325
Recall: 0.6435
F1 score: 0.6408670248425589
Testing Time: 0.0017346956513144753 (+/-) 0.0004837884511792188
Training Time: 2.014276921749115 (+/-) 0.021929229199596736
=== Average network evolution ===
Total hidden node: 15.444444444444445 (+/-) 2.206611837743361
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=17, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 17
No. of parameters : 161
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=17, out_features=2, bias=True)
)
No. of inputs : 17
No. of output : 2
No. of parameters : 36
100% (45 of 45) |########################| Elapsed Time: 0:01:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 65.42045454545456 (+/-) 6.110519228996329
Precision: 0.6495921328977567
Recall: 0.6542045454545454
F1 score: 0.6494342384114868
Testing Time: 0.0017120133746754038 (+/-) 0.0004435141164211631
Training Time: 2.010752921754664 (+/-) 0.027896650958578715
=== Average network evolution ===
Total hidden node: 8.755555555555556 (+/-) 1.5798108966378863
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=10, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 10
No. of parameters : 98
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=10, out_features=2, bias=True)
)
No. of inputs : 10
No. of output : 2
No. of parameters : 22
========== Performance occupancy ==========
Preq Accuracy: 64.99 (+/-) 0.64
F1 score: 0.65 (+/-) 0.01
Precision: 0.65 (+/-) 0.01
Recall: 0.65 (+/-) 0.01
Training time: 2.01 (+/-) 0.0
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 13.8 (+/-) 3.87

Infinite Delay
100% (45 of 45) |########################| Elapsed Time: 0:01:15 ETA: 00:00:00
=== Performance result === Accuracy: 57.98863636363637 (+/-) 7.074994889464615 Precision: 0.5570470421955053 Recall: 0.5798863636363636 F1 score: 0.5242768592447575 Testing Time: 0.0017226446758617055 (+/-) 0.0005381746399067189 Training Time: 1.6318173029206016 (+/-) 0.014913415487014097 === Average network evolution === Total hidden node: 4.2 (+/-) 4.2 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=8, out_features=4, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 8 No. of nodes : 4 No. of parameters : 44 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=4, out_features=2, bias=True) ) No. of inputs : 4 No. of output : 2 No. of parameters : 10
100% (45 of 45) |########################| Elapsed Time: 0:01:15 ETA: 00:00:00
=== Performance result === Accuracy: 55.78863636363636 (+/-) 5.5428174616546 Precision: 0.5231552622574426 Recall: 0.5578863636363637 F1 score: 0.504581935452061 Testing Time: 0.0016518831253051758 (+/-) 0.0005601404680834668 Training Time: 1.6447911587628452 (+/-) 0.01948767975843442 === Average network evolution === Total hidden node: 6.177777777777778 (+/-) 6.177777777777778 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=8, out_features=6, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 8 No. of nodes : 6 No. of parameters : 62 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=6, out_features=2, bias=True) ) No. of inputs : 6 No. of output : 2 No. of parameters : 14
100% (45 of 45) |########################| Elapsed Time: 0:01:16 ETA: 00:00:00
=== Performance result ===
Accuracy: 58.09090909090909 (+/-) 6.704605546726079 | Precision: 0.5643252855732328 | Recall: 0.5809090909090909 | F1 score: 0.45989854777171846
Testing Time: 0.0016510865905068138 (+/-) 0.000562458128152547 | Training Time: 1.656051830811934 (+/-) 0.014737095547373544
=== Average network evolution ===
Total hidden node: 7.622222222222222 (+/-) 7.622222222222222
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=8, out_features=7, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 8, nodes: 7, parameters: 71
Layer 2 (outputLayerBasicNet): Linear(in_features=7, out_features=2, bias=True); inputs: 7, outputs: 2, parameters: 16
100% (45 of 45) |########################| Elapsed Time: 0:01:15 ETA: 00:00:00
=== Performance result ===
Accuracy: 61.63409090909091 (+/-) 6.221916658138909 | Precision: 0.612948157781875 | Recall: 0.6163409090909091 | F1 score: 0.5738345741524136
Testing Time: 0.001754441044547341 (+/-) 0.0006354518444211762 | Training Time: 1.6443087458610535 (+/-) 0.018432656689547078
=== Average network evolution ===
Total hidden node: 13.955555555555556 (+/-) 13.955555555555556
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=8, out_features=14, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 8, nodes: 14, parameters: 134
Layer 2 (outputLayerBasicNet): Linear(in_features=14, out_features=2, bias=True); inputs: 14, outputs: 2, parameters: 30
100% (45 of 45) |########################| Elapsed Time: 0:01:18 ETA: 00:00:00
=== Performance result ===
Accuracy: 61.57272727272728 (+/-) 6.189704194590314 | Precision: 0.6062952624377834 | Recall: 0.6157272727272727 | F1 score: 0.5987134434980798
Testing Time: 0.0017207915132695978 (+/-) 0.000495171145331022 | Training Time: 1.7014540379697627 (+/-) 0.07965185737186324
=== Average network evolution ===
Total hidden node: 3.888888888888889 (+/-) 3.888888888888889
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=8, out_features=4, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 8, nodes: 4, parameters: 44
Layer 2 (outputLayerBasicNet): Linear(in_features=4, out_features=2, bias=True); inputs: 4, outputs: 2, parameters: 10
========== Performance occupancy ==========
Preq Accuracy: 59.02 (+/-) 2.27 | F1 score: 0.53 (+/-) 0.05 | Precision: 0.57 (+/-) 0.03 | Recall: 0.59 (+/-) 0.02
Training time: 1.66 (+/-) 0.02 | Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 | Number of features: 7.0 (+/-) 3.69
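The "(+/-)" figures in each summary block are the mean and spread of the prequential metrics over the five runs listed above it. The notebook's own bookkeeping code is not shown here, but the numbers are consistent with a population standard deviation; a minimal sketch of that aggregation (function and variable names are illustrative, not from the notebook):

```python
import statistics

def summarize(values, ndigits=2):
    """Mean and population standard deviation of per-run metric values, rounded for display."""
    mean = statistics.mean(values)
    # Assumption: population std (pstdev) matches the reported spreads; sample std would differ slightly.
    std = statistics.pstdev(values)
    return round(mean, ndigits), round(std, ndigits)

# Rounded prequential accuracies of the five infinite-delay runs above
accs = [57.99, 55.79, 58.09, 61.63, 61.57]
mean_acc, std_acc = summarize(accs)
print(f"Preq Accuracy: {mean_acc} (+/-) {std_acc}")
```

With full-precision inputs this reproduces the "Preq Accuracy: 59.02 (+/-) 2.27" line of the summary.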
%run DEVDAN_rmnist.ipynb
Number of input: 784 | Number of output: 10 | Number of batch: 69

All Data
100% (69 of 69) |########################| Elapsed Time: 0:05:21 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.20735294117647 (+/-) 4.281468830439008 | Precision: 0.9121754367545558 | Recall: 0.9120735294117647 | F1 score: 0.912061606808019
Testing Time: 0.01707222531823551 (+/-) 0.0019145821885862737 | Training Time: 4.7024281059994415 (+/-) 0.08675534703453633
=== Average network evolution ===
Total hidden node: 58.391304347826086 (+/-) 2.456682051970472
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=62, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 62, parameters: 49454
Layer 2 (outputLayerBasicNet): Linear(in_features=62, out_features=10, bias=True); inputs: 62, outputs: 10, parameters: 630
100% (69 of 69) |########################| Elapsed Time: 0:05:19 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.40588235294115 (+/-) 3.9306818130663967 | Precision: 0.9140466919257819 | Recall: 0.9140588235294118 | F1 score: 0.9139909778905166
Testing Time: 0.01687918691074147 (+/-) 0.0026816331184010274 | Training Time: 4.675958072437959 (+/-) 0.06743711113975903
=== Average network evolution ===
Total hidden node: 58.0 (+/-) 1.841549442134554
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=61, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 61, parameters: 48669
Layer 2 (outputLayerBasicNet): Linear(in_features=61, out_features=10, bias=True); inputs: 61, outputs: 10, parameters: 620
100% (69 of 69) |########################| Elapsed Time: 0:05:20 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.09264705882353 (+/-) 4.123544883330881 | Precision: 0.9112019231697668 | Recall: 0.9109264705882353 | F1 score: 0.9109815804650563
Testing Time: 0.017588159617255714 (+/-) 0.003171387856496166 | Training Time: 4.693253387423122 (+/-) 0.06415027373242652
=== Average network evolution ===
Total hidden node: 59.63768115942029 (+/-) 1.9409471353951546
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=60, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 60, parameters: 47884
Layer 2 (outputLayerBasicNet): Linear(in_features=60, out_features=10, bias=True); inputs: 60, outputs: 10, parameters: 610
100% (69 of 69) |########################| Elapsed Time: 0:05:25 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.00294117647057 (+/-) 4.596545076859207 | Precision: 0.909878369100459 | Recall: 0.9100294117647059 | F1 score: 0.9099125104754633
Testing Time: 0.017782393623800838 (+/-) 0.00314970831055761 | Training Time: 4.761308266836054 (+/-) 0.12095370393763206
=== Average network evolution ===
Total hidden node: 65.27536231884058 (+/-) 5.975515399445882
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=66, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 66, parameters: 52594
Layer 2 (outputLayerBasicNet): Linear(in_features=66, out_features=10, bias=True); inputs: 66, outputs: 10, parameters: 670
100% (69 of 69) |########################| Elapsed Time: 0:05:19 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.78676470588235 (+/-) 3.599587574895569 | Precision: 0.9076948693931182 | Recall: 0.9078676470588235 | F1 score: 0.9077291183849773
Testing Time: 0.017133681213154513 (+/-) 0.0029130128240678083 | Training Time: 4.676245226579554 (+/-) 0.11896093598016656
=== Average network evolution ===
Total hidden node: 53.91304347826087 (+/-) 0.44196958498483796
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=54, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 54, parameters: 43174
Layer 2 (outputLayerBasicNet): Linear(in_features=54, out_features=10, bias=True); inputs: 54, outputs: 10, parameters: 550
========== Performance summary ==========
Preq Accuracy: 91.1 (+/-) 0.21 | F1 score: 0.91 (+/-) 0.0 | Precision: 0.91 (+/-) 0.0 | Recall: 0.91 (+/-) 0.0
Training time: 4.7 (+/-) 0.03 | Testing time: 0.02 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 | Number of features: 60.6 (+/-) 3.88

50% Data
100% (69 of 69) |########################| Elapsed Time: 0:04:04 ETA: 00:00:00
=== Performance result ===
Accuracy: 89.48529411764704 (+/-) 4.54517406338151 | Precision: 0.8946411095399074 | Recall: 0.8948529411764706 | F1 score: 0.8947092649512606
Testing Time: 0.017579155809739056 (+/-) 0.0035187127681345453 | Training Time: 3.578778663102318 (+/-) 0.07623474988383902
=== Average network evolution ===
Total hidden node: 52.95652173913044 (+/-) 0.7505511522448727
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=54, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 54, parameters: 43174
Layer 2 (outputLayerBasicNet): Linear(in_features=54, out_features=10, bias=True); inputs: 54, outputs: 10, parameters: 550
100% (69 of 69) |########################| Elapsed Time: 0:04:11 ETA: 00:00:00
=== Performance result ===
Accuracy: 88.83970588235294 (+/-) 5.733701702008264 | Precision: 0.8890956008665238 | Recall: 0.8883970588235294 | F1 score: 0.8884828130318017
Testing Time: 0.018240879563724294 (+/-) 0.002735365512110943 | Training Time: 3.6799114872427547 (+/-) 0.1631239712216963
=== Average network evolution ===
Total hidden node: 63.69565217391305 (+/-) 1.6794093839850406
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=64, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 64, parameters: 51024
Layer 2 (outputLayerBasicNet): Linear(in_features=64, out_features=10, bias=True); inputs: 64, outputs: 10, parameters: 650
100% (69 of 69) |########################| Elapsed Time: 0:04:08 ETA: 00:00:00
=== Performance result ===
Accuracy: 88.00147058823529 (+/-) 6.045185071050563 | Precision: 0.880101699909268 | Recall: 0.880014705882353 | F1 score: 0.8800184805569037
Testing Time: 0.017962894018958595 (+/-) 0.003170714975085655 | Training Time: 3.6357354206197403 (+/-) 0.13321100776101344
=== Average network evolution ===
Total hidden node: 72.17391304347827 (+/-) 10.076456303910243
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=74, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 74, parameters: 58874
Layer 2 (outputLayerBasicNet): Linear(in_features=74, out_features=10, bias=True); inputs: 74, outputs: 10, parameters: 750
100% (69 of 69) |########################| Elapsed Time: 0:04:03 ETA: 00:00:00
=== Performance result ===
Accuracy: 88.9779411764706 (+/-) 6.399720710547482 | Precision: 0.8899817079802047 | Recall: 0.8897794117647059 | F1 score: 0.8896890426510843
Testing Time: 0.017855952767764822 (+/-) 0.003864937217027266 | Training Time: 3.5657929918345284 (+/-) 0.08910307577283723
=== Average network evolution ===
Total hidden node: 53.05797101449275 (+/-) 1.8406367673456832
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=54, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 54, parameters: 43174
Layer 2 (outputLayerBasicNet): Linear(in_features=54, out_features=10, bias=True); inputs: 54, outputs: 10, parameters: 550
100% (69 of 69) |########################| Elapsed Time: 0:04:06 ETA: 00:00:00
=== Performance result ===
Accuracy: 88.98235294117647 (+/-) 5.538287619633479 | Precision: 0.889639125251107 | Recall: 0.8898235294117647 | F1 score: 0.8897070553381168
Testing Time: 0.017487049102783203 (+/-) 0.0023208450811236124 | Training Time: 3.6025286316871643 (+/-) 0.0515585287315136
=== Average network evolution ===
Total hidden node: 60.84057971014493 (+/-) 0.926629992066422
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=61, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 61, parameters: 48669
Layer 2 (outputLayerBasicNet): Linear(in_features=61, out_features=10, bias=True); inputs: 61, outputs: 10, parameters: 620
========== Performance summary ==========
Preq Accuracy: 88.86 (+/-) 0.48 | F1 score: 0.89 (+/-) 0.0 | Precision: 0.89 (+/-) 0.0 | Recall: 0.89 (+/-) 0.0
Training time: 3.61 (+/-) 0.04 | Testing time: 0.02 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 | Number of features: 61.4 (+/-) 7.42

25% Data
100% (69 of 69) |########################| Elapsed Time: 0:03:29 ETA: 00:00:00
=== Performance result ===
Accuracy: 85.60294117647057 (+/-) 8.255568006756825 | Precision: 0.8564512931821048 | Recall: 0.8560294117647059 | F1 score: 0.8560578097393996
Testing Time: 0.018413403454948876 (+/-) 0.003726084408041066 | Training Time: 3.0616672424709095 (+/-) 0.0711913362567278
=== Average network evolution ===
Total hidden node: 56.2463768115942 (+/-) 0.907389903912958
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=56, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 56, parameters: 44744
Layer 2 (outputLayerBasicNet): Linear(in_features=56, out_features=10, bias=True); inputs: 56, outputs: 10, parameters: 570
100% (69 of 69) |########################| Elapsed Time: 0:03:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 86.3720588235294 (+/-) 6.8781804533961575 | Precision: 0.8635549034718024 | Recall: 0.8637205882352941 | F1 score: 0.8635194394325155
Testing Time: 0.017909737194285673 (+/-) 0.003601132254286814 | Training Time: 3.0407121391857372 (+/-) 0.05191835477606462
=== Average network evolution ===
Total hidden node: 54.94202898550725 (+/-) 0.4780412319556707
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=55, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 55, parameters: 43959
Layer 2 (outputLayerBasicNet): Linear(in_features=55, out_features=10, bias=True); inputs: 55, outputs: 10, parameters: 560
100% (69 of 69) |########################| Elapsed Time: 0:03:31 ETA: 00:00:00
=== Performance result ===
Accuracy: 86.19558823529412 (+/-) 8.045448728456014 | Precision: 0.8617422123635735 | Recall: 0.8619558823529412 | F1 score: 0.8616814522653437
Testing Time: 0.01930155824212467 (+/-) 0.003497060003830735 | Training Time: 3.0850580404786503 (+/-) 0.16363298363406073
=== Average network evolution ===
Total hidden node: 78.71014492753623 (+/-) 7.5892781907936975
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=81, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 81, parameters: 64369
Layer 2 (outputLayerBasicNet): Linear(in_features=81, out_features=10, bias=True); inputs: 81, outputs: 10, parameters: 820
100% (69 of 69) |########################| Elapsed Time: 0:03:25 ETA: 00:00:00
=== Performance result ===
Accuracy: 86.1029411764706 (+/-) 7.47424933449419 | Precision: 0.8613072341758994 | Recall: 0.8610294117647059 | F1 score: 0.8610755767964675
Testing Time: 0.018737442353192496 (+/-) 0.0034206276362946032 | Training Time: 3.0025552476153656 (+/-) 0.18770818497636385
=== Average network evolution ===
Total hidden node: 65.3913043478261 (+/-) 3.1031418921835616
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=66, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 66, parameters: 52594
Layer 2 (outputLayerBasicNet): Linear(in_features=66, out_features=10, bias=True); inputs: 66, outputs: 10, parameters: 670
100% (69 of 69) |########################| Elapsed Time: 0:03:17 ETA: 00:00:00
=== Performance result ===
Accuracy: 86.6735294117647 (+/-) 6.996797957701769 | Precision: 0.86660873487005 | Recall: 0.866735294117647 | F1 score: 0.8666113381236042
Testing Time: 0.018836785765255198 (+/-) 0.0034862526977909947 | Training Time: 2.878607988357544 (+/-) 0.0963126107930057
=== Average network evolution ===
Total hidden node: 54.98550724637681 (+/-) 0.11951030798891768
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=55, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 55, parameters: 43959
Layer 2 (outputLayerBasicNet): Linear(in_features=55, out_features=10, bias=True); inputs: 55, outputs: 10, parameters: 560
========== Performance summary ==========
Preq Accuracy: 86.19 (+/-) 0.35 | F1 score: 0.86 (+/-) 0.0 | Precision: 0.86 (+/-) 0.0 | Recall: 0.86 (+/-) 0.0
Training time: 3.01 (+/-) 0.07 | Testing time: 0.02 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 | Number of features: 62.6 (+/-) 10.09

Infinite Delay
100% (69 of 69) |########################| Elapsed Time: 0:02:53 ETA: 00:00:00
=== Performance result ===
Accuracy: 40.35147058823529 (+/-) 5.984796734644987 | Precision: 0.5583164249563346 | Recall: 0.40351470588235294 | F1 score: 0.39279515775170626
Testing Time: 0.02642878714729758 (+/-) 0.0035416090104053 | Training Time: 2.457667087807375 (+/-) 0.11507105679705569
=== Average network evolution ===
Total hidden node: 45.10144927536232 (+/-) 45.10144927536232
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=46, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 46, parameters: 36894
Layer 2 (outputLayerBasicNet): Linear(in_features=46, out_features=10, bias=True); inputs: 46, outputs: 10, parameters: 470
100% (69 of 69) |########################| Elapsed Time: 0:02:33 ETA: 00:00:00
=== Performance result ===
Accuracy: 11.911764705882355 (+/-) 8.45301076560655 | Precision: 0.6396257782256521 | Recall: 0.11911764705882352 | F1 score: 0.05044704152972008
Testing Time: 0.0464637419756721 (+/-) 0.010620419820038704 | Training Time: 2.158467313822578 (+/-) 0.12781023236265102
=== Average network evolution ===
Total hidden node: 28.014492753623188 (+/-) 28.014492753623188
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=29, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 29, parameters: 23549
Layer 2 (outputLayerBasicNet): Linear(in_features=29, out_features=10, bias=True); inputs: 29, outputs: 10, parameters: 300
100% (69 of 69) |########################| Elapsed Time: 0:03:00 ETA: 00:00:00
=== Performance result ===
Accuracy: 32.555882352941175 (+/-) 5.150396396878855 | Precision: 0.5843715504291254 | Recall: 0.3255588235294118 | F1 score: 0.26665577384207656
Testing Time: 0.030684032860924217 (+/-) 0.003343341850675314 | Training Time: 2.56908957046621 (+/-) 0.12418446109503803
=== Average network evolution ===
Total hidden node: 48.0 (+/-) 48.0
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=48, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 48, parameters: 38464
Layer 2 (outputLayerBasicNet): Linear(in_features=48, out_features=10, bias=True); inputs: 48, outputs: 10, parameters: 490
100% (69 of 69) |########################| Elapsed Time: 0:03:03 ETA: 00:00:00
=== Performance result ===
Accuracy: 20.17941176470588 (+/-) 7.342614513187613 | Precision: 0.629739736102647 | Recall: 0.20179411764705882 | F1 score: 0.1743979006296883
Testing Time: 0.03161644584992353 (+/-) 0.0033637960830123037 | Training Time: 2.6099616920246795 (+/-) 0.11351286571923842
=== Average network evolution ===
Total hidden node: 48.88405797101449 (+/-) 48.88405797101449
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=49, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 49, parameters: 39249
Layer 2 (outputLayerBasicNet): Linear(in_features=49, out_features=10, bias=True); inputs: 49, outputs: 10, parameters: 500
100% (69 of 69) |########################| Elapsed Time: 0:03:14 ETA: 00:00:00
=== Performance result ===
Accuracy: 20.755882352941175 (+/-) 7.301075310667706 | Precision: 0.6021139882705219 | Recall: 0.20755882352941177 | F1 score: 0.0978856819264646
Testing Time: 0.042128647074979896 (+/-) 0.005231481572402563 | Training Time: 2.755272826727699 (+/-) 0.12319388052125792
=== Average network evolution ===
Total hidden node: 49.17391304347826 (+/-) 49.17391304347826
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=49, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 49, parameters: 39249
Layer 2 (outputLayerBasicNet): Linear(in_features=49, out_features=10, bias=True); inputs: 49, outputs: 10, parameters: 500
========== Performance summary ==========
Preq Accuracy: 25.15 (+/-) 10.05 | F1 score: 0.2 (+/-) 0.12 | Precision: 0.6 (+/-) 0.03 | Recall: 0.25 (+/-) 0.1
Training time: 2.51 (+/-) 0.2 | Testing time: 0.04 (+/-) 0.01
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 | Number of features: 44.2 (+/-) 7.68
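The "No. of parameters" figures follow a fixed pattern. The output layer reports the usual weights-plus-bias count of its Linear, while the hidden layer reports in*out + out + in, i.e. one extra vector of size in_features. That extra term is consistent with DEVDAN treating the hidden layer as a denoising autoencoder, whose decoder needs a reconstruction bias over the inputs; this reading is inferred from the numbers above, not stated in the log. A sketch that reproduces the reported counts:

```python
def hidden_layer_params(n_in, n_nodes):
    # Weights + encoder bias + (assumed) decoder/reconstruction bias of size n_in.
    return n_in * n_nodes + n_nodes + n_in

def output_layer_params(n_nodes, n_out):
    # Plain Linear layer: weights + bias.
    return n_nodes * n_out + n_out

# First infinite-delay RMNIST run above: 784 -> 46 -> 10
print(hidden_layer_params(784, 46))  # 36894, matching the log
print(output_layer_params(46, 10))   # 470, matching the log
```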
%run DEVDAN_pmnist.ipynb
Number of input: 784 | Number of output: 10 | Number of batch: 69

All Data
100% (69 of 69) |########################| Elapsed Time: 0:05:11 ETA: 00:00:00
=== Performance result ===
Accuracy: 84.46323529411768 (+/-) 14.220468930091421 | Precision: 0.8477063888866383 | Recall: 0.8446323529411764 | F1 score: 0.8455863983439038
Testing Time: 0.019519462304956773 (+/-) 0.00653963938449298 | Training Time: 4.56293657947989 (+/-) 0.20790287299562826
=== Average network evolution ===
Total hidden node: 66.84057971014492 (+/-) 2.872272182440777
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=68, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 68, parameters: 54164
Layer 2 (outputLayerBasicNet): Linear(in_features=68, out_features=10, bias=True); inputs: 68, outputs: 10, parameters: 690
100% (69 of 69) |########################| Elapsed Time: 0:05:06 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.64264705882354 (+/-) 15.483096249254146 | Precision: 0.8372730319293231 | Recall: 0.8364264705882353 | F1 score: 0.8364451665917312
Testing Time: 0.019030735773198745 (+/-) 0.005757246363298547 | Training Time: 4.480970561504364 (+/-) 0.15852254866187857
=== Average network evolution ===
Total hidden node: 67.42028985507247 (+/-) 3.047478392649437
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=68, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 68, parameters: 54164
Layer 2 (outputLayerBasicNet): Linear(in_features=68, out_features=10, bias=True); inputs: 68, outputs: 10, parameters: 690
100% (69 of 69) |########################| Elapsed Time: 0:05:01 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.73676470588235 (+/-) 15.780040486898825 | Precision: 0.839320596369045 | Recall: 0.8373676470588235 | F1 score: 0.8378000633962865
Testing Time: 0.019588098806493422 (+/-) 0.006158839898437832 | Training Time: 4.418669662054847 (+/-) 0.09511035752293326
=== Average network evolution ===
Total hidden node: 75.91304347826087 (+/-) 4.744910715046365
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=77, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 77, parameters: 61229
Layer 2 (outputLayerBasicNet): Linear(in_features=77, out_features=10, bias=True); inputs: 77, outputs: 10, parameters: 780
100% (69 of 69) |########################| Elapsed Time: 0:05:31 ETA: 00:00:00
=== Performance result ===
Accuracy: 82.98970588235294 (+/-) 15.68634793947274 | Precision: 0.8315275886908015 | Recall: 0.8298970588235294 | F1 score: 0.8300294273977369
Testing Time: 0.02139613207648782 (+/-) 0.009149734297194515 | Training Time: 4.850568298031302 (+/-) 0.7032074593035993
=== Average network evolution ===
Total hidden node: 70.81159420289855 (+/-) 5.111156771756629
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=72, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 72, parameters: 57304
Layer 2 (outputLayerBasicNet): Linear(in_features=72, out_features=10, bias=True); inputs: 72, outputs: 10, parameters: 730
100% (69 of 69) |########################| Elapsed Time: 0:06:00 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.70294117647059 (+/-) 13.765344065145408 | Precision: 0.8380800246294042 | Recall: 0.8370294117647059 | F1 score: 0.8368702258128398
Testing Time: 0.022298511336831486 (+/-) 0.008514839029296473 | Training Time: 5.272772406830507 (+/-) 0.6826148103359203
=== Average network evolution ===
Total hidden node: 71.56521739130434 (+/-) 2.410594134892019
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=72, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 72, parameters: 57304
Layer 2 (outputLayerBasicNet): Linear(in_features=72, out_features=10, bias=True); inputs: 72, outputs: 10, parameters: 730
========== Performance summary ==========
Preq Accuracy: 83.71 (+/-) 0.47 | F1 score: 0.84 (+/-) 0.0 | Precision: 0.84 (+/-) 0.01 | Recall: 0.84 (+/-) 0.0
Training time: 4.72 (+/-) 0.31 | Testing time: 0.02 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 | Number of features: 71.4 (+/-) 3.32

50% Data
100% (69 of 69) |########################| Elapsed Time: 0:03:57 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.64411764705882 (+/-) 16.311204939378158 | Precision: 0.8088740858878152 | Recall: 0.8064411764705882 | F1 score: 0.8071685661665067
Testing Time: 0.019560855977675495 (+/-) 0.005556032377926591 | Training Time: 3.4651774483568527 (+/-) 0.1155630773483468
=== Average network evolution ===
Total hidden node: 72.05797101449275 (+/-) 3.9485240609662506
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=73, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 73, parameters: 58089
Layer 2 (outputLayerBasicNet): Linear(in_features=73, out_features=10, bias=True); inputs: 73, outputs: 10, parameters: 740
100% (69 of 69) |########################| Elapsed Time: 0:03:50 ETA: 00:00:00
=== Performance result ===
Accuracy: 79.90588235294116 (+/-) 15.704737786010172 | Precision: 0.8007929928271266 | Recall: 0.7990588235294117 | F1 score: 0.799310424819239
Testing Time: 0.018970875179066378 (+/-) 0.005843724202964007 | Training Time: 3.3753324571777794 (+/-) 0.18148726980565028
=== Average network evolution ===
Total hidden node: 58.44927536231884 (+/-) 2.8919490037200903
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=59, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 59, parameters: 47099
Layer 2 (outputLayerBasicNet): Linear(in_features=59, out_features=10, bias=True); inputs: 59, outputs: 10, parameters: 600
100% (69 of 69) |########################| Elapsed Time: 0:03:53 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.61617647058823 (+/-) 16.79972522791858 | Precision: 0.8071697565957654 | Recall: 0.8061617647058823 | F1 score: 0.8063066282018548
Testing Time: 0.019190066000994516 (+/-) 0.00726453453979381 | Training Time: 3.4093506897197052 (+/-) 0.16147735140998654
=== Average network evolution ===
Total hidden node: 67.15942028985508 (+/-) 3.697328406443708
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=68, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 68, parameters: 54164
Layer 2 (outputLayerBasicNet): Linear(in_features=68, out_features=10, bias=True); inputs: 68, outputs: 10, parameters: 690
100% (69 of 69) |########################| Elapsed Time: 0:03:50 ETA: 00:00:00
=== Performance result ===
Accuracy: 81.42647058823529 (+/-) 15.070467981795106 | Precision: 0.8179819748630814 | Recall: 0.8142647058823529 | F1 score: 0.8150378134510861
Testing Time: 0.01902237709830789 (+/-) 0.006990126299196121 | Training Time: 3.3746082011391136 (+/-) 0.20019087998340945
=== Average network evolution ===
Total hidden node: 67.18840579710145 (+/-) 4.246995058124569
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=68, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 68, parameters: 54164
Layer 2 (outputLayerBasicNet): Linear(in_features=68, out_features=10, bias=True); inputs: 68, outputs: 10, parameters: 690
100% (69 of 69) |########################| Elapsed Time: 0:03:49 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.25441176470588 (+/-) 16.64575934181868 | Precision: 0.8049818285180451 | Recall: 0.8025441176470588 | F1 score: 0.8033938721295196
Testing Time: 0.019356699550853056 (+/-) 0.00632902866724976 | Training Time: 3.359912907376009 (+/-) 0.07699155493447761
=== Average network evolution ===
Total hidden node: 69.95652173913044 (+/-) 4.604455254593793
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=71, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 71, parameters: 56519
Layer 2 (outputLayerBasicNet): Linear(in_features=71, out_features=10, bias=True); inputs: 71, outputs: 10, parameters: 720
========== Performance summary ==========
Preq Accuracy: 80.57 (+/-) 0.51 | F1 score: 0.81 (+/-) 0.01 | Precision: 0.81 (+/-) 0.01 | Recall: 0.81 (+/-) 0.01
Training time: 3.4 (+/-) 0.04 | Testing time: 0.02 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 | Number of features: 67.8 (+/-) 4.79

25% Data
100% (69 of 69) |########################| Elapsed Time: 0:03:18 ETA: 00:00:00
=== Performance result ===
Accuracy: 76.26323529411764 (+/-) 18.35365730833219 | Precision: 0.7632599069907098 | Recall: 0.7626323529411765 | F1 score: 0.7622666288465874
Testing Time: 0.020237466868232277 (+/-) 0.007883924198523126 | Training Time: 2.892399111214806 (+/-) 0.07849819951815269
=== Average network evolution ===
Total hidden node: 75.34782608695652 (+/-) 3.020826679044449
=== Final network structure ===
Layer 1 (hiddenLayerBasicNet): Linear(in_features=784, out_features=76, bias=True), Sigmoid(), ReLU(inplace=True); inputs: 784, nodes: 76, parameters: 60444
Layer 2 (outputLayerBasicNet): Linear(in_features=76, out_features=10, bias=True); inputs: 76, outputs: 10, parameters: 770
100% (69 of 69) |########################| Elapsed Time: 0:03:35 ETA: 00:00:00
=== Performance result ===
Accuracy: 76.75882352941177 (+/-) 17.991470489668785 Precision: 0.7719089970854155 Recall: 0.7675882352941177 F1 score: 0.7681206167185662
Testing Time: 0.020353804616367117 (+/-) 0.006372210879471908 Training Time: 3.14686339041766 (+/-) 0.1572943608352314
=== Average network evolution ===
Total hidden node: 105.57971014492753 (+/-) 16.192148568930026
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=109, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 109 No. of parameters : 86349
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=109, out_features=10, bias=True) )
No. of inputs : 109 No. of output : 10 No. of parameters : 1100
100% (69 of 69) |########################| Elapsed Time: 0:03:36 ETA: 00:00:00
=== Performance result ===
Accuracy: 72.54705882352943 (+/-) 20.609017662094903 Precision: 0.7266106361396942 Recall: 0.7254705882352941 F1 score: 0.7258523036769005
Testing Time: 0.022645764491137338 (+/-) 0.008644239474507506 Training Time: 3.1532852509442497 (+/-) 0.32391587090899954
=== Average network evolution ===
Total hidden node: 108.52173913043478 (+/-) 28.071135670738194
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=118, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 118 No. of parameters : 93414
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=118, out_features=10, bias=True) )
No. of inputs : 118 No. of output : 10 No. of parameters : 1190
100% (69 of 69) |########################| Elapsed Time: 0:03:20 ETA: 00:00:00
=== Performance result ===
Accuracy: 75.93529411764706 (+/-) 17.723611409193193 Precision: 0.7608392682886446 Recall: 0.7593529411764706 F1 score: 0.7594134027197383
Testing Time: 0.020824078251333797 (+/-) 0.007195452501504053 Training Time: 2.9196199459188126 (+/-) 0.09006584413496722
=== Average network evolution ===
Total hidden node: 76.01449275362319 (+/-) 3.661364098466811
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=77, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 77 No. of parameters : 61229
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=77, out_features=10, bias=True) )
No. of inputs : 77 No. of output : 10 No. of parameters : 780
100% (69 of 69) |########################| Elapsed Time: 0:03:18 ETA: 00:00:00
=== Performance result ===
Accuracy: 76.74264705882354 (+/-) 18.30353000489628 Precision: 0.7718339088999195 Recall: 0.7674264705882353 F1 score: 0.7687114972534361
Testing Time: 0.020540973719428566 (+/-) 0.006068717508294874 Training Time: 2.9016337640145244 (+/-) 0.0864527580733426
=== Average network evolution ===
Total hidden node: 72.1159420289855 (+/-) 3.1325140705887433
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=73, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 73 No. of parameters : 58089
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=73, out_features=10, bias=True) )
No. of inputs : 73 No. of output : 10 No. of parameters : 740
========== Performance occupancy ==========
Preq Accuracy: 75.65 (+/-) 1.58 F1 score: 0.76 (+/-) 0.02 Precision: 0.76 (+/-) 0.02 Recall: 0.76 (+/-) 0.02
Training time: 3.0 (+/-) 0.12 Testing time: 0.02 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 90.6 (+/-) 18.96

Infinite Delay
100% (69 of 69) |########################| Elapsed Time: 0:03:10 ETA: 00:00:00
=== Performance result ===
Accuracy: 14.794117647058824 (+/-) 12.057288281037387 Precision: 0.4923295915143977 Recall: 0.14794117647058824 F1 score: 0.11077296634098843
Testing Time: 0.04539057787726907 (+/-) 0.009691218082167937 Training Time: 2.6980996552635643 (+/-) 0.10094805593908691
=== Average network evolution ===
Total hidden node: 48.666666666666664 (+/-) 48.666666666666664
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=49, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 49 No. of parameters : 39249
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=49, out_features=10, bias=True) )
No. of inputs : 49 No. of output : 10 No. of parameters : 500
100% (69 of 69) |########################| Elapsed Time: 0:04:18 ETA: 00:00:00
=== Performance result ===
Accuracy: 13.144117647058824 (+/-) 5.901081215808755 Precision: 0.57568057093761 Recall: 0.13144117647058823 F1 score: 0.08903377247400275
Testing Time: 0.05407240811516257 (+/-) 0.01164466316740226 Training Time: 3.6930234327035794 (+/-) 0.15151468951744565
=== Average network evolution ===
Total hidden node: 39.0 (+/-) 39.0
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=39, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 39 No. of parameters : 31399
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=39, out_features=10, bias=True) )
No. of inputs : 39 No. of output : 10 No. of parameters : 400
100% (69 of 69) |########################| Elapsed Time: 0:03:05 ETA: 00:00:00
=== Performance result ===
Accuracy: 12.735294117647058 (+/-) 3.3306404924526753 Precision: 0.5529513464004769 Recall: 0.12735294117647059 F1 score: 0.07498186872757526
Testing Time: 0.04363921810598934 (+/-) 0.007891193042713018 Training Time: 2.6308416689143463 (+/-) 0.10570466941161397
=== Average network evolution ===
Total hidden node: 45.98550724637681 (+/-) 45.98550724637681
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=46, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 46 No. of parameters : 36894
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=46, out_features=10, bias=True) )
No. of inputs : 46 No. of output : 10 No. of parameters : 470
100% (69 of 69) |########################| Elapsed Time: 0:03:08 ETA: 00:00:00
=== Performance result ===
Accuracy: 16.966176470588238 (+/-) 12.592827210175193 Precision: 0.5568840085846568 Recall: 0.16966176470588235 F1 score: 0.15442680140180015
Testing Time: 0.04555094242095947 (+/-) 0.009359089565275739 Training Time: 2.6623107089715847 (+/-) 0.10166135441199978
=== Average network evolution ===
Total hidden node: 51.98550724637681 (+/-) 51.98550724637681
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=52, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 52 No. of parameters : 41604
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=52, out_features=10, bias=True) )
No. of inputs : 52 No. of output : 10 No. of parameters : 530
100% (69 of 69) |########################| Elapsed Time: 0:03:09 ETA: 00:00:00
=== Performance result ===
Accuracy: 14.688235294117648 (+/-) 8.618559477897014 Precision: 0.5567975564179721 Recall: 0.14688235294117646 F1 score: 0.10612780652863711
Testing Time: 0.04555041649762322 (+/-) 0.0073884780938668 Training Time: 2.6771987711682037 (+/-) 0.10645972604388527
=== Average network evolution ===
Total hidden node: 53.98550724637681 (+/-) 53.98550724637681
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=54, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 54 No. of parameters : 43174
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=54, out_features=10, bias=True) )
No. of inputs : 54 No. of output : 10 No. of parameters : 550
========== Performance occupancy ==========
Preq Accuracy: 14.47 (+/-) 1.49 F1 score: 0.11 (+/-) 0.03 Precision: 0.55 (+/-) 0.03 Recall: 0.14 (+/-) 0.01
Training time: 2.87 (+/-) 0.41 Testing time: 0.05 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 48.0 (+/-) 5.25
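The "No. of parameters" figures in the structure dumps above follow a simple pattern. The output layer is a plain linear layer (weights plus bias), while the hidden-layer count only matches if a decoder bias of size `in_features` is added on top of the encoder weights and bias, consistent with a tied-weight denoising autoencoder; that reading is inferred from the numbers, not taken from the notebook code. A minimal sketch with hypothetical helper names:

```python
def hidden_layer_params(n_in: int, n_nodes: int) -> int:
    # Encoder weights + encoder bias + decoder bias of size n_in
    # (tied decoder weights contribute no extra parameters).
    return n_in * n_nodes + n_nodes + n_in

def output_layer_params(n_in: int, n_out: int) -> int:
    # Plain linear layer: weights + bias.
    return n_in * n_out + n_out

# Figures reported in the runs above:
print(hidden_layer_params(784, 71))   # 56519
print(output_layer_params(71, 10))    # 720
print(hidden_layer_params(784, 118))  # 93414
print(output_layer_params(118, 10))   # 1190
```

The same formulas reproduce every parameter count in this log.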
%run DEVDAN_hepmass.ipynb
Number of input: 28 Number of output: 2 Number of batch: 2000

All Data
100% (2000 of 2000) |####################| Elapsed Time: 1:31:46 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.94497248624312 (+/-) 1.617551149919122 Precision: 0.840275600584852 Recall: 0.8394497248624312 F1 score: 0.8393514266670782
Testing Time: 0.002415422560752422 (+/-) 0.0005241059887008398 Training Time: 2.7385008901640915 (+/-) 0.05825420715262009
=== Average network evolution ===
Total hidden node: 14.9825 (+/-) 0.25533066795823794
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=15, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 15 No. of parameters : 463
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=15, out_features=2, bias=True) )
No. of inputs : 15 No. of output : 2 No. of parameters : 32
100% (2000 of 2000) |####################| Elapsed Time: 1:31:49 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.98689344672336 (+/-) 1.6790318275992033 Precision: 0.8406943545111044 Recall: 0.8398689344672337 F1 score: 0.8397710674470118
Testing Time: 0.0024690111617316837 (+/-) 0.0006760279173701602 Training Time: 2.740053920879431 (+/-) 0.056411695847366994
=== Average network evolution ===
Total hidden node: 16.9425 (+/-) 0.5110711789956465
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=17, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 17 No. of parameters : 521
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=17, out_features=2, bias=True) )
No. of inputs : 17 No. of output : 2 No. of parameters : 36
100% (2000 of 2000) |####################| Elapsed Time: 1:31:52 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.92516258129066 (+/-) 1.462884881721431 Precision: 0.8414218582324984 Recall: 0.8392516258129065 F1 score: 0.8389944414168139
Testing Time: 0.002476679199394314 (+/-) 0.0005371772400377058 Training Time: 2.741735331829695 (+/-) 0.04712511541327894
=== Average network evolution ===
Total hidden node: 21.9905 (+/-) 0.38263527019865823
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=22, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 22 No. of parameters : 666
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=22, out_features=2, bias=True) )
No. of inputs : 22 No. of output : 2 No. of parameters : 46
100% (2000 of 2000) |####################| Elapsed Time: 1:32:03 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.94917458729365 (+/-) 1.6823772661526757 Precision: 0.8408720748654416 Recall: 0.8394917458729365 F1 score: 0.8393280297896675
Testing Time: 0.002429525872479086 (+/-) 0.000524417335889239 Training Time: 2.7472197538378715 (+/-) 0.06656484856197609
=== Average network evolution ===
Total hidden node: 15.9275 (+/-) 0.5091598471992858
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=16, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 16 No. of parameters : 492
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=16, out_features=2, bias=True) )
No. of inputs : 16 No. of output : 2 No. of parameters : 34
100% (2000 of 2000) |####################| Elapsed Time: 1:34:21 ETA: 00:00:00
=== Performance result ===
Accuracy: 84.05227613806903 (+/-) 1.6477199587056814 Precision: 0.8413378563985309 Recall: 0.8405227613806904 F1 score: 0.8404266905200606
Testing Time: 0.0024295195512261136 (+/-) 0.0005946276599019067 Training Time: 2.8159550720003024 (+/-) 0.28433889304271576
=== Average network evolution ===
Total hidden node: 10.8625 (+/-) 0.5259218097778413
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=11, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 11 No. of parameters : 347
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=11, out_features=2, bias=True) )
No. of inputs : 11 No. of output : 2 No. of parameters : 24
========== Performance occupancy ==========
Preq Accuracy: 83.97 (+/-) 0.04 F1 score: 0.84 (+/-) 0.0 Precision: 0.84 (+/-) 0.0 Recall: 0.84 (+/-) 0.0
Training time: 2.76 (+/-) 0.03 Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 16.2 (+/-) 3.54

50% Data
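The "Performance occupancy" rows appear to aggregate the five runs above: each figure matches the mean and *population* standard deviation (divide by n, not n - 1) of the per-run values, and "Number of features" matches the final hidden-node counts. A quick check, with the aggregation rule inferred from the numbers rather than taken from the notebook code:

```python
import math

def mean_std(xs):
    # Mean and population standard deviation (divide by n, not n - 1),
    # which is what reproduces the summary figures above.
    m = sum(xs) / len(xs)
    return m, math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

# Per-run accuracies and final hidden-node counts of the five runs above
accs  = [83.94497248624312, 83.98689344672336, 83.92516258129066,
         83.94917458729365, 84.05227613806903]
nodes = [15, 17, 22, 16, 11]

m, s = mean_std(accs)
print(round(m, 2), round(s, 2))   # 83.97 0.04 -> "Preq Accuracy"
m, s = mean_std(nodes)
print(round(m, 2), round(s, 2))   # 16.2 3.54 -> "Number of features"
```

The sample standard deviation (n - 1 in the denominator) would give 0.05 for the accuracies, so the summaries use the population form.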
100% (2000 of 2000) |####################| Elapsed Time: 1:25:22 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.20290145072536 (+/-) 1.5333679253167165 Precision: 0.8347911104092015 Recall: 0.8320290145072536 F1 score: 0.8316802463229234
Testing Time: 0.0027140078990682474 (+/-) 0.0006646677541460836 Training Time: 2.545525513034036 (+/-) 0.33814500561862465
=== Average network evolution ===
Total hidden node: 10.974 (+/-) 0.3454330615329111
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=11, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 11 No. of parameters : 347
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=11, out_features=2, bias=True) )
No. of inputs : 11 No. of output : 2 No. of parameters : 24
100% (2000 of 2000) |####################| Elapsed Time: 1:20:43 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.40575287643823 (+/-) 1.800889647569179 Precision: 0.8360561727896367 Recall: 0.8340575287643822 F1 score: 0.8338090908765071
Testing Time: 0.002615449308096736 (+/-) 0.0006026017746485783 Training Time: 2.4071752180630948 (+/-) 0.03408486555618928
=== Average network evolution ===
Total hidden node: 13.6075 (+/-) 0.7067133435842285
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=14, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 14 No. of parameters : 434
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=14, out_features=2, bias=True) )
No. of inputs : 14 No. of output : 2 No. of parameters : 30
100% (2000 of 2000) |####################| Elapsed Time: 1:20:55 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.47633816908454 (+/-) 1.6976561536149783 Precision: 0.8361477169766328 Recall: 0.8347633816908454 F1 score: 0.834591975112313
Testing Time: 0.002615268019332237 (+/-) 0.0005961787670204172 Training Time: 2.412870115372704 (+/-) 0.04198359680753023
=== Average network evolution ===
Total hidden node: 11.779 (+/-) 0.5675905214148664
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=12, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 12 No. of parameters : 376
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=12, out_features=2, bias=True) )
No. of inputs : 12 No. of output : 2 No. of parameters : 26
100% (2000 of 2000) |####################| Elapsed Time: 1:21:10 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.15852926463232 (+/-) 1.9373390718608576 Precision: 0.8341168659704457 Recall: 0.8315852926463232 F1 score: 0.8312641293250608
Testing Time: 0.002643382924982999 (+/-) 0.0006016567706170252 Training Time: 2.420537390668372 (+/-) 0.05151565320175395
=== Average network evolution ===
Total hidden node: 14.91 (+/-) 0.6526101439603894
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=15, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 15 No. of parameters : 463
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=15, out_features=2, bias=True) )
No. of inputs : 15 No. of output : 2 No. of parameters : 32
100% (2000 of 2000) |####################| Elapsed Time: 1:20:49 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.63126563281641 (+/-) 1.7679255440598047 Precision: 0.8376653272971607 Recall: 0.8363126563281641 F1 score: 0.836147479116078
Testing Time: 0.002614494679628938 (+/-) 0.0006374683777889488 Training Time: 2.410004387025895 (+/-) 0.037536546171158196
=== Average network evolution ===
Total hidden node: 11.983 (+/-) 0.20666639784928748
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=12, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 12 No. of parameters : 376
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=12, out_features=2, bias=True) )
No. of inputs : 12 No. of output : 2 No. of parameters : 26
========== Performance occupancy ==========
Preq Accuracy: 83.37 (+/-) 0.18 F1 score: 0.83 (+/-) 0.0 Precision: 0.84 (+/-) 0.0 Recall: 0.83 (+/-) 0.0
Training time: 2.44 (+/-) 0.05 Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 12.8 (+/-) 1.47

25% Data
100% (2000 of 2000) |####################| Elapsed Time: 1:08:27 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.3791895947974 (+/-) 1.9568576313739636 Precision: 0.8345751674362907 Recall: 0.833791895947974 F1 score: 0.8336937175688032
Testing Time: 0.0026152254403263644 (+/-) 0.0008853610062663889 Training Time: 2.0390662587601405 (+/-) 0.036691067704053185
=== Average network evolution ===
Total hidden node: 15.6065 (+/-) 0.9506091468106121
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=16, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 16 No. of parameters : 492
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=16, out_features=2, bias=True) )
No. of inputs : 16 No. of output : 2 No. of parameters : 34
100% (2000 of 2000) |####################| Elapsed Time: 1:04:19 ETA: 00:00:00
=== Performance result ===
Accuracy: 82.60200100050025 (+/-) 2.4684514628226246 Precision: 0.8284026274994908 Recall: 0.8260200100050025 F1 score: 0.8257022978311099
Testing Time: 0.0025530441097166017 (+/-) 0.0010492917626780643 Training Time: 1.9143049403272192 (+/-) 0.1176206457623524
=== Average network evolution ===
Total hidden node: 12.6615 (+/-) 0.9627656776183913
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=13, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 13 No. of parameters : 405
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=13, out_features=2, bias=True) )
No. of inputs : 13 No. of output : 2 No. of parameters : 28
100% (2000 of 2000) |####################| Elapsed Time: 0:59:52 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.0495247623812 (+/-) 2.2751344229700856 Precision: 0.8310488045006306 Recall: 0.8304952476238119 F1 score: 0.8304236271559869
Testing Time: 0.0025563472625671356 (+/-) 0.0005716842513456687 Training Time: 1.7811180863039322 (+/-) 0.05706431943391897
=== Average network evolution ===
Total hidden node: 13.73 (+/-) 0.9225508116087698
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=14, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 14 No. of parameters : 434
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=14, out_features=2, bias=True) )
No. of inputs : 14 No. of output : 2 No. of parameters : 30
100% (2000 of 2000) |####################| Elapsed Time: 1:00:12 ETA: 00:00:00
=== Performance result ===
Accuracy: 83.17868934467234 (+/-) 2.2626944062399446 Precision: 0.8326866153493208 Recall: 0.8317868934467234 F1 score: 0.8316721655407643
Testing Time: 0.002417790883776544 (+/-) 0.0005446077076096997 Training Time: 1.7914270170334878 (+/-) 0.0720721146030465
=== Average network evolution ===
Total hidden node: 15.823 (+/-) 0.9201472708213616
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=16, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 16 No. of parameters : 492
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=16, out_features=2, bias=True) )
No. of inputs : 16 No. of output : 2 No. of parameters : 34
100% (2000 of 2000) |####################| Elapsed Time: 0:59:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 82.56288144072036 (+/-) 1.8273457520949818 Precision: 0.8279498074644591 Recall: 0.8256288144072036 F1 score: 0.8253181887021902
Testing Time: 0.002401047196610085 (+/-) 0.0005454930307811034 Training Time: 1.7742607118846059 (+/-) 0.03826508549160752
=== Average network evolution ===
Total hidden node: 10.455 (+/-) 0.6949640278460462
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=11, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 11 No. of parameters : 347
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=11, out_features=2, bias=True) )
No. of inputs : 11 No. of output : 2 No. of parameters : 24
========== Performance occupancy ==========
Preq Accuracy: 82.95 (+/-) 0.32 F1 score: 0.83 (+/-) 0.0 Precision: 0.83 (+/-) 0.0 Recall: 0.83 (+/-) 0.0
Training time: 1.86 (+/-) 0.1 Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 14.0 (+/-) 1.9

Infinite Delay
100% (2000 of 2000) |####################| Elapsed Time: 0:48:35 ETA: 00:00:00
=== Performance result ===
Accuracy: 50.75472736368184 (+/-) 2.157684920305618 Precision: 0.5087542212437823 Recall: 0.5075472736368184 F1 score: 0.49043963303755456
Testing Time: 0.0023691895128548773 (+/-) 0.0005888000906879686 Training Time: 1.4410137990643348 (+/-) 0.02794844091858741
=== Average network evolution ===
Total hidden node: 4.973 (+/-) 4.973
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=4, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 4 No. of parameters : 144
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=4, out_features=2, bias=True) )
No. of inputs : 4 No. of output : 2 No. of parameters : 10
100% (2000 of 2000) |####################| Elapsed Time: 0:48:47 ETA: 00:00:00
=== Performance result ===
Accuracy: 53.42706353176588 (+/-) 2.9209191731154887 Precision: 0.5383581662827182 Recall: 0.5342706353176588 F1 score: 0.5216177333439694
Testing Time: 0.0023720186910013846 (+/-) 0.0006796684575414441 Training Time: 1.4470708457275054 (+/-) 0.030957027793999028
=== Average network evolution ===
Total hidden node: 5.7255 (+/-) 5.7255
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=5, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 5 No. of parameters : 173
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=5, out_features=2, bias=True) )
No. of inputs : 5 No. of output : 2 No. of parameters : 12
100% (2000 of 2000) |####################| Elapsed Time: 0:48:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 51.96468234117058 (+/-) 3.7916159434170686 Precision: 0.5616023806753287 Recall: 0.5196468234117059 F1 score: 0.42086200549438646
Testing Time: 0.002382431702950169 (+/-) 0.0005195655447862198 Training Time: 1.4461388783552696 (+/-) 0.03516382186684283
=== Average network evolution ===
Total hidden node: 3.97 (+/-) 3.97
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=4, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 4 No. of parameters : 144
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=4, out_features=2, bias=True) )
No. of inputs : 4 No. of output : 2 No. of parameters : 10
100% (2000 of 2000) |####################| Elapsed Time: 0:48:57 ETA: 00:00:00
=== Performance result ===
Accuracy: 51.49979989994998 (+/-) 2.401321122121272 Precision: 0.5163623994323763 Recall: 0.5149979989994997 F1 score: 0.5044768976762106
Testing Time: 0.0023855894669823313 (+/-) 0.0005291492071272844 Training Time: 1.4524227999161934 (+/-) 0.029726907650753864
=== Average network evolution ===
Total hidden node: 11.9735 (+/-) 11.9735
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=12, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 12 No. of parameters : 376
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=12, out_features=2, bias=True) )
No. of inputs : 12 No. of output : 2 No. of parameters : 26
100% (2000 of 2000) |####################| Elapsed Time: 0:49:04 ETA: 00:00:00
=== Performance result ===
Accuracy: 52.803251625812905 (+/-) 3.4481892983034697 Precision: 0.529688862025156 Recall: 0.5280325162581291 F1 score: 0.5214419466575607
Testing Time: 0.0023869110859293173 (+/-) 0.0005301014485385265 Training Time: 1.4558078914716759 (+/-) 0.03763215248394666
=== Average network evolution ===
Total hidden node: 4.0665 (+/-) 4.0665
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=28, out_features=4, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 28 No. of nodes : 4 No. of parameters : 144
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=4, out_features=2, bias=True) )
No. of inputs : 4 No. of output : 2 No. of parameters : 10
========== Performance occupancy ==========
Preq Accuracy: 52.09 (+/-) 0.94 F1 score: 0.49 (+/-) 0.04 Precision: 0.53 (+/-) 0.02 Recall: 0.52 (+/-) 0.01
Training time: 1.45 (+/-) 0.01 Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 5.8 (+/-) 3.12
%run DEVDAN_susy.ipynb
Number of input: 18 Number of output: 2 Number of batch: 2000

All Data
100% (2000 of 2000) |####################| Elapsed Time: 1:31:29 ETA: 00:00:00
=== Performance result ===
Accuracy: 76.97398699349675 (+/-) 2.717476034821803 Precision: 0.7735739685058987 Recall: 0.7697398699349675 F1 score: 0.7666964423079438
Testing Time: 0.002090639922546112 (+/-) 0.00047567050193025374 Training Time: 2.7306137197073728 (+/-) 0.05769203579874548
=== Average network evolution ===
Total hidden node: 11.838 (+/-) 2.8081588274169964
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=17, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 18 No. of nodes : 17 No. of parameters : 341
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=17, out_features=2, bias=True) )
No. of inputs : 17 No. of output : 2 No. of parameters : 36
100% (2000 of 2000) |####################| Elapsed Time: 1:31:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 76.98519259629816 (+/-) 2.8082118647573395 Precision: 0.7734010443215474 Recall: 0.7698519259629815 F1 score: 0.7669291806356665
Testing Time: 0.002060988713885141 (+/-) 0.0004845275830802349 Training Time: 2.735074549928315 (+/-) 0.060216675398504636
=== Average network evolution ===
Total hidden node: 14.298 (+/-) 1.1925585939483225
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=16, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 18 No. of nodes : 16 No. of parameters : 322
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=16, out_features=2, bias=True) )
No. of inputs : 16 No. of output : 2 No. of parameters : 34
100% (2000 of 2000) |####################| Elapsed Time: 1:31:46 ETA: 00:00:00
=== Performance result ===
Accuracy: 77.48504252126062 (+/-) 2.4199771756068436 Precision: 0.777128413816308 Recall: 0.7748504252126063 F1 score: 0.7726037654440399
Testing Time: 0.0021290051573333055 (+/-) 0.0004895105390619302 Training Time: 2.7388323221640802 (+/-) 0.06267061563097323
=== Average network evolution ===
Total hidden node: 23.338 (+/-) 3.565915871133249
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=27, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 18 No. of nodes : 27 No. of parameters : 531
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=27, out_features=2, bias=True) )
No. of inputs : 27 No. of output : 2 No. of parameters : 56
100% (2000 of 2000) |####################| Elapsed Time: 1:47:16 ETA: 00:00:00
=== Performance result ===
Accuracy: 77.35272636318159 (+/-) 2.6901505116903968 Precision: 0.7752844998478545 Recall: 0.7735272636318159 F1 score: 0.7714931359734093
Testing Time: 0.0024418880964530115 (+/-) 0.0012863634862076206 Training Time: 3.202883983266658 (+/-) 0.363274596971193
=== Average network evolution ===
Total hidden node: 23.5815 (+/-) 3.8351476829452085
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=27, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 18 No. of nodes : 27 No. of parameters : 531
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=27, out_features=2, bias=True) )
No. of inputs : 27 No. of output : 2 No. of parameters : 56
100% (2000 of 2000) |####################| Elapsed Time: 1:59:12 ETA: 00:00:00
=== Performance result ===
Accuracy: 77.4376188094047 (+/-) 2.5450902292264286 Precision: 0.7758681854082652 Recall: 0.7743761880940471 F1 score: 0.772490368800658
Testing Time: 0.002634201841750343 (+/-) 0.0011834921047449844 Training Time: 3.560552574623341 (+/-) 0.4193703165050777
=== Average network evolution ===
Total hidden node: 21.0775 (+/-) 3.712343431041907
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=24, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 18 No. of nodes : 24 No. of parameters : 474
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=24, out_features=2, bias=True) )
No. of inputs : 24 No. of output : 2 No. of parameters : 50
========== Performance occupancy ==========
Preq Accuracy: 77.25 (+/-) 0.22 F1 score: 0.77 (+/-) 0.0 Precision: 0.78 (+/-) 0.0 Recall: 0.77 (+/-) 0.0
Training time: 2.99 (+/-) 0.34 Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 22.2 (+/-) 4.79

50% Data
100% (2000 of 2000) |####################| Elapsed Time: 1:23:51 ETA: 00:00:00
=== Performance result ===
Accuracy: 76.7576288144072 (+/-) 2.955108612851145 Precision: 0.7697308580460802 Recall: 0.7675762881440721 F1 score: 0.7652135475961187
Testing Time: 0.002299461083271433 (+/-) 0.0005664678502657382 Training Time: 2.501218373564376 (+/-) 0.05753238442573188
=== Average network evolution ===
Total hidden node: 14.7215 (+/-) 2.4188711726753866
=== Final network structure ===
1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=18, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 18 No. of nodes : 18 No. of parameters : 360
2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=18, out_features=2, bias=True) )
No. of inputs : 18 No. of output : 2 No. of parameters : 38
100% (2000 of 2000) |####################| Elapsed Time: 1:23:41 ETA: 00:00:00
=== Performance result === Accuracy: 76.83806903451726 (+/-) 3.0423287717494185 Precision: 0.7699622027183309 Recall: 0.7683806903451725 F1 score: 0.7663218990684774 Testing Time: 0.002301635713562958 (+/-) 0.0005765694894401364 Training Time: 2.4960943728223213 (+/-) 0.050810992016208575 === Average network evolution === Total hidden node: 19.9865 (+/-) 4.16236924719564 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=25, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 25 No. of parameters : 493 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=25, out_features=2, bias=True) ) No. of inputs : 25 No. of output : 2 No. of parameters : 52
100% (2000 of 2000) |####################| Elapsed Time: 1:23:24 ETA: 00:00:00
=== Performance result === Accuracy: 76.8687843921961 (+/-) 2.891431524550962 Precision: 0.7700361331116216 Recall: 0.768687843921961 F1 score: 0.7667584671569725 Testing Time: 0.002316659065769457 (+/-) 0.001073298156615216 Training Time: 2.4877691871228964 (+/-) 0.05020084951208561 === Average network evolution === Total hidden node: 18.2715 (+/-) 4.753923405987942 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=23, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 23 No. of parameters : 455 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=23, out_features=2, bias=True) ) No. of inputs : 23 No. of output : 2 No. of parameters : 48
100% (2000 of 2000) |####################| Elapsed Time: 1:22:43 ETA: 00:00:00
=== Performance result === Accuracy: 77.01610805402701 (+/-) 2.6885227798363576 Precision: 0.7722616957989414 Recall: 0.7701610805402701 F1 score: 0.7678864605916301 Testing Time: 0.0022730450441743088 (+/-) 0.0005915107902712769 Training Time: 2.467009422718256 (+/-) 0.05937490229458213 === Average network evolution === Total hidden node: 18.6385 (+/-) 1.962604837964077 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=21, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 21 No. of parameters : 417 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=21, out_features=2, bias=True) ) No. of inputs : 21 No. of output : 2 No. of parameters : 44
100% (2000 of 2000) |####################| Elapsed Time: 1:23:22 ETA: 00:00:00
=== Performance result === Accuracy: 76.6591795897949 (+/-) 2.8653637957691664 Precision: 0.7684092597619984 Recall: 0.766591795897949 F1 score: 0.7643715158640759 Testing Time: 0.0022854710770225813 (+/-) 0.0012547480099249176 Training Time: 2.4861639875838493 (+/-) 0.11608222738610692 === Average network evolution === Total hidden node: 14.923 (+/-) 1.9738467519035008 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=17, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 17 No. of parameters : 341 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=17, out_features=2, bias=True) ) No. of inputs : 17 No. of output : 2 No. of parameters : 36
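For reference, the aggregate "Preq Accuracy: 76.83 (+/-) 0.12" reported in the summary for these five 50%-data runs is reproducible as the mean and population standard deviation of the per-run accuracies printed above. A minimal stdlib sketch (using the population rather than sample standard deviation is an inference from matching the printed figures, not taken from the notebook code):

```python
from statistics import mean, pstdev

# Per-run accuracies copied from the five 50%-data runs above
accuracies = [
    76.7576288144072,
    76.83806903451726,
    76.8687843921961,
    77.01610805402701,
    76.6591795897949,
]

avg = mean(accuracies)
spread = pstdev(accuracies)  # population std dev, matching the (+/-) figures
print(f"Preq Accuracy: {avg:.2f} (+/-) {spread:.2f}")  # 76.83 (+/-) 0.12
```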
========== Performance occupancy ========== Preq Accuracy: 76.83 (+/-) 0.12 F1 score: 0.77 (+/-) 0.0 Precision: 0.77 (+/-) 0.0 Recall: 0.77 (+/-) 0.0 Training time: 2.49 (+/-) 0.01 Testing time: 0.0 (+/-) 0.0 ========== Network ========== Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 20.8 (+/-) 2.99
25% Data
100% (2000 of 2000) |####################| Elapsed Time: 1:03:02 ETA: 00:00:00
=== Performance result === Accuracy: 75.77238619309655 (+/-) 3.7100471808282234 Precision: 0.7590134389026876 Recall: 0.7577238619309655 F1 score: 0.7555809434129982 Testing Time: 0.0022550687126781776 (+/-) 0.0028760468676830935 Training Time: 1.8771466827201748 (+/-) 0.23298772328929407 === Average network evolution === Total hidden node: 12.518 (+/-) 2.170639537095001 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=15, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 15 No. of parameters : 303 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=15, out_features=2, bias=True) ) No. of inputs : 15 No. of output : 2 No. of parameters : 32
100% (2000 of 2000) |####################| Elapsed Time: 0:59:05 ETA: 00:00:00
=== Performance result === Accuracy: 75.79779889944973 (+/-) 3.7585780039643626 Precision: 0.7597817297730193 Recall: 0.7579779889944972 F1 score: 0.7555517265816827 Testing Time: 0.002085630747006499 (+/-) 0.0007181364372320873 Training Time: 1.7592337274622953 (+/-) 0.15554612589621944 === Average network evolution === Total hidden node: 15.2445 (+/-) 4.657544390556037 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=22, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 22 No. of parameters : 436 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=22, out_features=2, bias=True) ) No. of inputs : 22 No. of output : 2 No. of parameters : 46
100% (2000 of 2000) |####################| Elapsed Time: 0:56:38 ETA: 00:00:00
=== Performance result === Accuracy: 76.02711355677839 (+/-) 3.3185285200492847 Precision: 0.7618248566557027 Recall: 0.7602711355677839 F1 score: 0.7580390682321614 Testing Time: 0.002108743394715241 (+/-) 0.00045909816414511993 Training Time: 1.6863909511938282 (+/-) 0.03688547261906658 === Average network evolution === Total hidden node: 18.4645 (+/-) 4.655291585926708 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=24, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 24 No. of parameters : 474 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=24, out_features=2, bias=True) ) No. of inputs : 24 No. of output : 2 No. of parameters : 50
100% (2000 of 2000) |####################| Elapsed Time: 0:56:30 ETA: 00:00:00
=== Performance result === Accuracy: 75.65167583791896 (+/-) 3.6694847734011105 Precision: 0.7592734028028071 Recall: 0.7565167583791896 F1 score: 0.7535592229279618 Testing Time: 0.002000094533503324 (+/-) 0.00048576565723468896 Training Time: 1.6820502683125238 (+/-) 0.03796032489101617 === Average network evolution === Total hidden node: 12.18 (+/-) 2.3953287874527787 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=16, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 16 No. of parameters : 322 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=16, out_features=2, bias=True) ) No. of inputs : 16 No. of output : 2 No. of parameters : 34
100% (2000 of 2000) |####################| Elapsed Time: 0:56:48 ETA: 00:00:00
=== Performance result === Accuracy: 75.97173586793397 (+/-) 3.5715395119628583 Precision: 0.7614283004033627 Recall: 0.7597173586793396 F1 score: 0.7573857215221373 Testing Time: 0.002102606889305859 (+/-) 0.000496291871148963 Training Time: 1.6910256031097444 (+/-) 0.040681664005599846 === Average network evolution === Total hidden node: 19.8635 (+/-) 2.710695067690204 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=24, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 24 No. of parameters : 474 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=24, out_features=2, bias=True) ) No. of inputs : 24 No. of output : 2 No. of parameters : 50
========== Performance occupancy ========== Preq Accuracy: 75.84 (+/-) 0.14 F1 score: 0.76 (+/-) 0.0 Precision: 0.76 (+/-) 0.0 Recall: 0.76 (+/-) 0.0 Training time: 1.74 (+/-) 0.07 Testing time: 0.0 (+/-) 0.0 ========== Network ========== Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 20.2 (+/-) 3.92
Infinite Delay
100% (2000 of 2000) |####################| Elapsed Time: 0:46:12 ETA: 00:00:00
=== Performance result === Accuracy: 45.906403201600796 (+/-) 1.9756924608500392 Precision: 0.4997991165105068 Recall: 0.459064032016008 F1 score: 0.3064849628778757 Testing Time: 0.001980981807699199 (+/-) 0.00045426592244336174 Training Time: 1.371322394132972 (+/-) 0.02922666062749222 === Average network evolution === Total hidden node: 4.978 (+/-) 4.978 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=5, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 5 No. of parameters : 113 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=5, out_features=2, bias=True) ) No. of inputs : 5 No. of output : 2 No. of parameters : 12
100% (2000 of 2000) |####################| Elapsed Time: 0:46:24 ETA: 00:00:00
=== Performance result === Accuracy: 54.50350175087544 (+/-) 1.5744164434610863 Precision: 0.5519056696032544 Recall: 0.5450350175087544 F1 score: 0.4035039520778062 Testing Time: 0.001995535359614011 (+/-) 0.00045898116057726114 Training Time: 1.3772968506443315 (+/-) 0.03396331840364975 === Average network evolution === Total hidden node: 7.023 (+/-) 7.023 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=7, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 7 No. of parameters : 151 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=7, out_features=2, bias=True) ) No. of inputs : 7 No. of output : 2 No. of parameters : 16
100% (2000 of 2000) |####################| Elapsed Time: 0:46:57 ETA: 00:00:00
=== Performance result === Accuracy: 55.794447223611805 (+/-) 1.7546776424476087 Precision: 0.6309272672752251 Recall: 0.5579444722361181 F1 score: 0.4300309636989829 Testing Time: 0.0020052767682516796 (+/-) 0.00046534598983510086 Training Time: 1.393513302137519 (+/-) 0.06203093630588441 === Average network evolution === Total hidden node: 14.996 (+/-) 14.996 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=15, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 15 No. of parameters : 303 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=15, out_features=2, bias=True) ) No. of inputs : 15 No. of output : 2 No. of parameters : 32
100% (2000 of 2000) |####################| Elapsed Time: 0:51:02 ETA: 00:00:00
=== Performance result === Accuracy: 48.443671835917954 (+/-) 2.942444173339367 Precision: 0.5214009114521715 Recall: 0.48443671835917956 F1 score: 0.43294794070822784 Testing Time: 0.002092533913059435 (+/-) 0.0006810886525140304 Training Time: 1.5154523513148939 (+/-) 0.14269708464366124 === Average network evolution === Total hidden node: 13.9965 (+/-) 13.9965 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=14, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 14 No. of parameters : 284 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=14, out_features=2, bias=True) ) No. of inputs : 14 No. of output : 2 No. of parameters : 30
100% (2000 of 2000) |####################| Elapsed Time: 0:50:07 ETA: 00:00:00
=== Performance result === Accuracy: 54.79664832416208 (+/-) 1.7087315920626034 Precision: 0.6370388289627408 Recall: 0.5479664832416208 F1 score: 0.3987836492203695 Testing Time: 0.002073227375253789 (+/-) 0.0004930975134833653 Training Time: 1.488355452445461 (+/-) 0.037557004461820656 === Average network evolution === Total hidden node: 14.994 (+/-) 14.994 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=18, out_features=15, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 18 No. of nodes : 15 No. of parameters : 303 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=15, out_features=2, bias=True) ) No. of inputs : 15 No. of output : 2 No. of parameters : 32
========== Performance occupancy ========== Preq Accuracy: 51.89 (+/-) 3.95 F1 score: 0.39 (+/-) 0.05 Precision: 0.57 (+/-) 0.06 Recall: 0.52 (+/-) 0.04 Training time: 1.43 (+/-) 0.06 Testing time: 0.0 (+/-) 0.0 ========== Network ========== Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 11.2 (+/-) 4.31
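The "No. of parameters" counts printed in the network-structure blocks throughout this log follow a consistent pattern: the output layer is a plain `Linear` (weights plus biases), while the hidden layer carries `in_features` extra parameters on top of its `Linear`, consistent with a decoder bias from DEVDAN's generative phase. This interpretation is inferred from the printed counts alone, not from the notebook code; a hypothetical spot-check:

```python
def hidden_layer_params(n_in, n_out):
    # Encoder weight (n_in*n_out) + encoder bias (n_out) + n_in extra
    # parameters, assumed here to be a decoder bias (generative phase).
    return n_in * n_out + n_out + n_in

def output_layer_params(n_in, n_out):
    # Plain Linear layer: weight matrix + bias vector.
    return n_in * n_out + n_out

# Spot-checks against counts printed in this log:
print(hidden_layer_params(18, 24))  # 474 (18-input runs above)
print(output_layer_params(24, 2))   # 50
print(hidden_layer_params(5, 47))   # 287 (occupancy ablation below)
```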
%run DEVDAN_occupancy-ablation.ipynb
Number of input: 5 Number of output: 2 Number of batch: 20
Without Generative Phase
100% (20 of 20) |########################| Elapsed Time: 0:00:37 ETA: 00:00:00
=== Performance result === Accuracy: 93.57368421052632 (+/-) 12.156299735885836 Precision: 0.9354095930085677 Recall: 0.9357368421052632 F1 score: 0.9336798895720925 Testing Time: 0.0017215452696147718 (+/-) 0.00044726776982812177 Training Time: 1.955162148726614 (+/-) 0.029636624515251238 === Average network evolution === Total hidden node: 30.05 (+/-) 13.063211703099663 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=47, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 47 No. of parameters : 287 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=47, out_features=2, bias=True) ) No. of inputs : 47 No. of output : 2 No. of parameters : 96
100% (20 of 20) |########################| Elapsed Time: 0:00:39 ETA: 00:00:00
=== Performance result === Accuracy: 92.57368421052631 (+/-) 13.422951362077493 Precision: 0.9250918532028085 Recall: 0.9257368421052632 F1 score: 0.92299337444738 Testing Time: 0.0015000293129368832 (+/-) 0.0004957695660650428 Training Time: 2.0762069601761666 (+/-) 0.10863757866221659 === Average network evolution === Total hidden node: 29.5 (+/-) 10.924742559895861 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=45, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 45 No. of parameters : 275 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=45, out_features=2, bias=True) ) No. of inputs : 45 No. of output : 2 No. of parameters : 92
100% (20 of 20) |########################| Elapsed Time: 0:00:41 ETA: 00:00:00
=== Performance result === Accuracy: 93.1157894736842 (+/-) 12.79832485576036 Precision: 0.9320049652630612 Recall: 0.9311578947368421 F1 score: 0.9280837691227649 Testing Time: 0.0015570364500346937 (+/-) 0.000498296115491202 Training Time: 2.1909703831923637 (+/-) 0.14375092989333738 === Average network evolution === Total hidden node: 34.5 (+/-) 10.781929326423912 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=49, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 49 No. of parameters : 299 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=49, out_features=2, bias=True) ) No. of inputs : 49 No. of output : 2 No. of parameters : 100
100% (20 of 20) |########################| Elapsed Time: 0:00:38 ETA: 00:00:00
=== Performance result === Accuracy: 88.57368421052632 (+/-) 14.35621420941443 Precision: 0.8832427672236072 Recall: 0.8857368421052632 F1 score: 0.8782999480839504 Testing Time: 0.0016620912049946032 (+/-) 0.00046385915208133614 Training Time: 2.0400126733277975 (+/-) 0.10790820181123534 === Average network evolution === Total hidden node: 23.8 (+/-) 11.311940593903417 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=40, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 40 No. of parameters : 245 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=40, out_features=2, bias=True) ) No. of inputs : 40 No. of output : 2 No. of parameters : 82
100% (20 of 20) |########################| Elapsed Time: 0:00:37 ETA: 00:00:00
=== Performance result === Accuracy: 90.96842105263158 (+/-) 14.334428655350944 Precision: 0.9098439440373084 Recall: 0.9096842105263158 F1 score: 0.9045042669375507 Testing Time: 0.0015575634805779707 (+/-) 0.0005002158389697008 Training Time: 1.9723938264344867 (+/-) 0.04622126171582108 === Average network evolution === Total hidden node: 26.0 (+/-) 13.435028842544403 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=44, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 44 No. of parameters : 269 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=44, out_features=2, bias=True) ) No. of inputs : 44 No. of output : 2 No. of parameters : 90
========== Performance ========== Preq Accuracy: 91.76 (+/-) 1.82 F1 score: 0.91 (+/-) 0.02 Precision: 0.92 (+/-) 0.02 Recall: 0.92 (+/-) 0.02 Training time: 2.05 (+/-) 0.08 Testing time: 0.0 (+/-) 0.0 ========== Network ========== Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 45.0 (+/-) 3.03
Without Node Growing
100% (20 of 20) |########################| Elapsed Time: 0:00:51 ETA: 00:00:00
=== Performance result === Accuracy: 93.22105263157894 (+/-) 11.061387509205563 Precision: 0.9310473512530334 Recall: 0.9322105263157895 F1 score: 0.9307915888167794 Testing Time: 0.0012961563311125104 (+/-) 0.0004605102345438681 Training Time: 2.7318730103342155 (+/-) 0.09849592600050304 === Average network evolution === Total hidden node: 2.0 (+/-) 0.0 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 2 No. of parameters : 17 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) ) No. of inputs : 2 No. of output : 2 No. of parameters : 6
100% (20 of 20) |########################| Elapsed Time: 0:00:56 ETA: 00:00:00
=== Performance result === Accuracy: 88.03157894736842 (+/-) 17.007554130648096 Precision: 0.8794058440952276 Recall: 0.8803157894736842 F1 score: 0.8703573223073371 Testing Time: 0.0013403516066701788 (+/-) 0.0004936956945676099 Training Time: 2.9934796534086527 (+/-) 0.27580916333792205 === Average network evolution === Total hidden node: 2.0 (+/-) 0.0 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 2 No. of parameters : 17 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) ) No. of inputs : 2 No. of output : 2 No. of parameters : 6
100% (20 of 20) |########################| Elapsed Time: 0:00:55 ETA: 00:00:00
=== Performance result === Accuracy: 91.09473684210526 (+/-) 14.63648914201873 Precision: 0.9097408367071526 Recall: 0.9109473684210526 F1 score: 0.906880608026281 Testing Time: 0.001617883381090666 (+/-) 0.00048469349308078385 Training Time: 2.929529779835751 (+/-) 0.25433039202432345 === Average network evolution === Total hidden node: 2.0 (+/-) 0.0 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 2 No. of parameters : 17 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) ) No. of inputs : 2 No. of output : 2 No. of parameters : 6
100% (20 of 20) |########################| Elapsed Time: 0:00:55 ETA: 00:00:00
=== Performance result === Accuracy: 91.16315789473686 (+/-) 13.571805547427154 Precision: 0.9115771379144545 Recall: 0.9116315789473685 F1 score: 0.9068440903674022 Testing Time: 0.0014144872364244964 (+/-) 0.0005900485199699953 Training Time: 2.9057316403639946 (+/-) 0.21580917278991182 === Average network evolution === Total hidden node: 2.0 (+/-) 0.0 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 2 No. of parameters : 17 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) ) No. of inputs : 2 No. of output : 2 No. of parameters : 6
100% (20 of 20) |########################| Elapsed Time: 0:00:53 ETA: 00:00:00
=== Performance result === Accuracy: 66.1263157894737 (+/-) 30.02423305660145 Precision: 0.6182089531389595 Recall: 0.6612631578947369 F1 score: 0.6375445494182905 Testing Time: 0.00150827357643529 (+/-) 0.0004937362762175094 Training Time: 2.827217127147474 (+/-) 0.09335132160517298 === Average network evolution === Total hidden node: 2.0 (+/-) 0.0 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 2 No. of parameters : 17 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) ) No. of inputs : 2 No. of output : 2 No. of parameters : 6
========== Performance ========== Preq Accuracy: 85.93 (+/-) 10.04 F1 score: 0.85 (+/-) 0.11 Precision: 0.85 (+/-) 0.12 Recall: 0.86 (+/-) 0.1 Training time: 2.88 (+/-) 0.09 Testing time: 0.0 (+/-) 0.0 ========== Network ========== Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 2.0 (+/-) 0.0
Without Node Pruning
100% (20 of 20) |########################| Elapsed Time: 0:00:54 ETA: 00:00:00
=== Performance result === Accuracy: 89.34736842105264 (+/-) 16.77110197619241 Precision: 0.8942125703570543 Recall: 0.8934736842105263 F1 score: 0.8853101906658994 Testing Time: 0.0016607485319438734 (+/-) 0.0004672761631468815 Training Time: 2.8557581148649516 (+/-) 0.18121124774363737 === Average network evolution === Total hidden node: 41.45 (+/-) 15.419062876841771 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=60, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 60 No. of parameters : 365 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=60, out_features=2, bias=True) ) No. of inputs : 60 No. of output : 2 No. of parameters : 122
100% (20 of 20) |########################| Elapsed Time: 0:00:53 ETA: 00:00:00
=== Performance result === Accuracy: 92.82631578947368 (+/-) 12.987842173407007 Precision: 0.9269162360829145 Recall: 0.9282631578947368 F1 score: 0.9267992093359675 Testing Time: 0.0020866268559506067 (+/-) 0.0005534460880592887 Training Time: 2.8215924689644263 (+/-) 0.10037888092560249 === Average network evolution === Total hidden node: 49.8 (+/-) 10.514751542475931 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=63, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 63 No. of parameters : 383 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=63, out_features=2, bias=True) ) No. of inputs : 63 No. of output : 2 No. of parameters : 128
100% (20 of 20) |########################| Elapsed Time: 0:00:52 ETA: 00:00:00
=== Performance result === Accuracy: 93.08947368421053 (+/-) 12.632243403576508 Precision: 0.9303692142927361 Recall: 0.9308947368421052 F1 score: 0.9285607545462513 Testing Time: 0.0018228857140792044 (+/-) 0.00036562200567119145 Training Time: 2.7697612988321403 (+/-) 0.17622557889511117 === Average network evolution === Total hidden node: 39.35 (+/-) 14.468154685377124 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=58, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 58 No. of parameters : 353 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=58, out_features=2, bias=True) ) No. of inputs : 58 No. of output : 2 No. of parameters : 118
100% (20 of 20) |########################| Elapsed Time: 0:00:51 ETA: 00:00:00
=== Performance result === Accuracy: 92.71052631578945 (+/-) 12.117837862417279 Precision: 0.9257116259072304 Recall: 0.9271052631578948 F1 score: 0.9257348938734065 Testing Time: 0.0017140790035850124 (+/-) 0.000444109980173383 Training Time: 2.7278683311060856 (+/-) 0.06939978773878293 === Average network evolution === Total hidden node: 42.05 (+/-) 12.714067012565256 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=60, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 60 No. of parameters : 365 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=60, out_features=2, bias=True) ) No. of inputs : 60 No. of output : 2 No. of parameters : 122
100% (20 of 20) |########################| Elapsed Time: 0:00:50 ETA: 00:00:00
=== Performance result === Accuracy: 92.97894736842105 (+/-) 12.37584995262392 Precision: 0.9291277423948574 Recall: 0.9297894736842105 F1 score: 0.9274584390969114 Testing Time: 0.001758675826223273 (+/-) 0.000411551638846052 Training Time: 2.6792213038394324 (+/-) 0.018530827724186166 === Average network evolution === Total hidden node: 53.35 (+/-) 13.9329645086751 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=5, out_features=70, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 5 No. of nodes : 70 No. of parameters : 425 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=70, out_features=2, bias=True) ) No. of inputs : 70 No. of output : 2 No. of parameters : 142
========== Performance ========== Preq Accuracy: 92.19 (+/-) 1.43 F1 score: 0.92 (+/-) 0.02 Precision: 0.92 (+/-) 0.01 Recall: 0.92 (+/-) 0.01 Training time: 2.77 (+/-) 0.06 Testing time: 0.0 (+/-) 0.0 ========== Network ========== Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 62.2 (+/-) 4.21
%run DEVDAN_creditcarddefault-ablation.ipynb
Number of input: 24 Number of output: 2 Number of batch: 30
Without Generative Phase
100% (30 of 30) |########################| Elapsed Time: 0:00:58 ETA: 00:00:00
=== Performance result === Accuracy: 80.75517241379309 (+/-) 2.4581442687799173 Precision: 0.7892019427565701 Recall: 0.8075517241379311 F1 score: 0.7686752798962437 Testing Time: 0.002075014443233095 (+/-) 0.00040529412902852056 Training Time: 2.014861509717744 (+/-) 0.042307318598226874 === Average network evolution === Total hidden node: 9.833333333333334 (+/-) 0.37267799624996495 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=10, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 24 No. of nodes : 10 No. of parameters : 274 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=10, out_features=2, bias=True) ) No. of inputs : 10 No. of output : 2 No. of parameters : 22
100% (30 of 30) |########################| Elapsed Time: 0:00:58 ETA: 00:00:00
=== Performance result === Accuracy: 80.56206896551727 (+/-) 2.2478209821196953 Precision: 0.785122533452928 Recall: 0.8056206896551724 F1 score: 0.7671577677581873 Testing Time: 0.002462082895739325 (+/-) 0.0005675301232885047 Training Time: 2.0267319761473557 (+/-) 0.05960429204693319 === Average network evolution === Total hidden node: 10.966666666666667 (+/-) 0.9480975102218595 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=12, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 24 No. of nodes : 12 No. of parameters : 324 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=12, out_features=2, bias=True) ) No. of inputs : 12 No. of output : 2 No. of parameters : 26
100% (30 of 30) |########################| Elapsed Time: 0:00:58 ETA: 00:00:00
=== Performance result === Accuracy: 80.82068965517243 (+/-) 2.267062402795488 Precision: 0.790507239445375 Recall: 0.8082068965517242 F1 score: 0.7693591866744017 Testing Time: 0.0021467455502214104 (+/-) 0.0003767043434993368 Training Time: 2.0179267011839768 (+/-) 0.05193487323095378 === Average network evolution === Total hidden node: 13.2 (+/-) 0.5416025603090641 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=14, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 24 No. of nodes : 14 No. of parameters : 374 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=14, out_features=2, bias=True) ) No. of inputs : 14 No. of output : 2 No. of parameters : 30
100% (30 of 30) |########################| Elapsed Time: 0:00:58 ETA: 00:00:00
=== Performance result === Accuracy: 80.4448275862069 (+/-) 2.172490420104516 Precision: 0.7836486405080502 Recall: 0.804448275862069 F1 score: 0.7644299192154417 Testing Time: 0.002009539768613618 (+/-) 0.00032250459829410766 Training Time: 2.017317089541205 (+/-) 0.01932895288136567 === Average network evolution === Total hidden node: 13.233333333333333 (+/-) 0.8034647195462634 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=14, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 24 No. of nodes : 14 No. of parameters : 374 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=14, out_features=2, bias=True) ) No. of inputs : 14 No. of output : 2 No. of parameters : 30
100% (30 of 30) |########################| Elapsed Time: 0:00:58 ETA: 00:00:00
=== Performance result === Accuracy: 80.64827586206897 (+/-) 2.3697258228799933 Precision: 0.7863984073661209 Recall: 0.8064827586206896 F1 score: 0.7687611865036567 Testing Time: 0.0022159280448124327 (+/-) 0.0005670748202569871 Training Time: 2.0239893485759866 (+/-) 0.04198224779478828 === Average network evolution === Total hidden node: 10.2 (+/-) 0.7916228058025278 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=11, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 24 No. of nodes : 11 No. of parameters : 299 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=11, out_features=2, bias=True) ) No. of inputs : 11 No. of output : 2 No. of parameters : 24
========== Performance ========== Preq Accuracy: 80.65 (+/-) 0.13 F1 score: 0.77 (+/-) 0.0 Precision: 0.79 (+/-) 0.0 Recall: 0.81 (+/-) 0.0 Training time: 2.02 (+/-) 0.0 Testing time: 0.0 (+/-) 0.0 ========== Network ========== Number of hidden layers: 1.0 (+/-) 0.0 Number of features: 12.2 (+/-) 1.6
Without Node Growing
100% (30 of 30) |########################| Elapsed Time: 0:01:17 ETA: 00:00:00
=== Performance result === Accuracy: 80.21379310344827 (+/-) 2.4871704096923066 Precision: 0.7804464325282893 Recall: 0.8021379310344827 F1 score: 0.7593708743162471 Testing Time: 0.0020793964122903757 (+/-) 0.00047984794131673993 Training Time: 2.679647009948204 (+/-) 0.017859648861649674 === Average network evolution === Total hidden node: 2.0 (+/-) 0.0 === Final network structure === 1 -th layer hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) ) No. of inputs : 24 No. of nodes : 2 No. of parameters : 74 2 -th layer outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) ) No. of inputs : 2 No. of output : 2 No. of parameters : 6
100% (30 of 30) |########################| Elapsed Time: 0:01:18 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.14827586206896 (+/-) 2.4843816534023504
Precision: 0.780320280000189
Recall: 0.8014827586206896
F1 score: 0.7566657493213086
Testing Time: 0.0019522864243079875 (+/-) 0.00018179590242385895
Training Time: 2.714654642960121 (+/-) 0.05977906297909256
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 24 No. of nodes : 2 No. of parameters : 74
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) )
No. of inputs : 2 No. of output : 2 No. of parameters : 6
100% (30 of 30) |########################| Elapsed Time: 0:01:21 ETA: 00:00:00
C:\Users\SCSE\AppData\Local\Continuum\miniconda3\envs\stmicro\lib\site-packages\sklearn\metrics\_classification.py:1221: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
  _warn_prf(average, modifier, msg_start, len(result))
=== Performance result ===
Accuracy: 77.85517241379311 (+/-) 2.505247762115322
Precision: 0.6061427871581452
Recall: 0.7785517241379311
F1 score: 0.6816138984678044
Testing Time: 0.0020787798125168374 (+/-) 0.0006611789320584703
Training Time: 2.804120581725548 (+/-) 0.21483715292091082
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 24 No. of nodes : 2 No. of parameters : 74
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) )
No. of inputs : 2 No. of output : 2 No. of parameters : 6
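The `UndefinedMetricWarning` raised during this run means at least one class received no predictions in some evaluation window, so scikit-learn set that class's precision to 0.0, which is why the precision here (0.606) sits well below the recall. If that fallback is the intended behavior, the warning can be silenced explicitly via the `zero_division` parameter, as a small sketch:

```python
from sklearn.metrics import precision_score

y_true = [0, 0, 1, 1]
y_pred = [0, 0, 0, 0]  # class 1 is never predicted, so its precision is undefined

# zero_division=0 keeps the default fallback (undefined precision -> 0.0)
# but suppresses the warning; macro precision = (0.5 + 0.0) / 2
macro = precision_score(y_true, y_pred, average="macro", zero_division=0)
print(macro)  # 0.25
```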
100% (30 of 30) |########################| Elapsed Time: 0:01:21 ETA: 00:00:00
=== Performance result ===
Accuracy: 79.60344827586206 (+/-) 2.7007639734764064
Precision: 0.7752990342897405
Recall: 0.7960344827586207
F1 score: 0.7405883226407914
Testing Time: 0.0020143574681775324 (+/-) 0.0004109426995956416
Training Time: 2.798151813704392 (+/-) 0.14502516273757177
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 24 No. of nodes : 2 No. of parameters : 74
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) )
No. of inputs : 2 No. of output : 2 No. of parameters : 6
100% (30 of 30) |########################| Elapsed Time: 0:01:18 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.25172413793103 (+/-) 2.554727961717818
Precision: 0.781668875361991
Recall: 0.8025172413793104
F1 score: 0.7591090951895206
Testing Time: 0.0021172885237068966 (+/-) 0.0003446694873858148
Training Time: 2.7147968637532203 (+/-) 0.026166999146322756
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 24 No. of nodes : 2 No. of parameters : 74
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) )
No. of inputs : 2 No. of output : 2 No. of parameters : 6
========== Performance ==========
Preq Accuracy: 79.61 (+/-) 0.91
F1 score: 0.74 (+/-) 0.03
Precision: 0.74 (+/-) 0.07
Recall: 0.8 (+/-) 0.01
Training time: 2.74 (+/-) 0.05
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 2.0 (+/-) 0.0

Without Node Pruning
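Each "========== Performance ==========" block aggregates the five runs above it: 79.61 (+/-) 0.91 is the mean and population standard deviation of the five per-run accuracies. A minimal reproduction, with the run values copied (truncated) from the five results above:

```python
import statistics

# Prequential accuracies of the five "Without Node Growing" runs above
accuracies = [80.2138, 80.1483, 77.8552, 79.6034, 80.2517]

mean = statistics.fmean(accuracies)
std = statistics.pstdev(accuracies)  # population std (ddof=0), which matches the log
print(f"{mean:.2f} (+/-) {std:.2f}")  # 79.61 (+/-) 0.91
```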
100% (30 of 30) |########################| Elapsed Time: 0:01:19 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.03793103448277 (+/-) 3.1093201156282793
Precision: 0.7744572583511895
Recall: 0.8003793103448276
F1 score: 0.7674301387864683
Testing Time: 0.002382878599495723 (+/-) 0.0004924165011141951
Training Time: 2.7337934395362593 (+/-) 0.026392461399504807
=== Average network evolution ===
Total hidden node: 40.733333333333334 (+/-) 5.21493581509452
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=42, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 24 No. of nodes : 42 No. of parameters : 1074
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=42, out_features=2, bias=True) )
No. of inputs : 42 No. of output : 2 No. of parameters : 86
100% (30 of 30) |########################| Elapsed Time: 0:01:20 ETA: 00:00:00
=== Performance result ===
Accuracy: 79.88620689655173 (+/-) 4.175026255373238
Precision: 0.7720012833977559
Recall: 0.7988620689655173
F1 score: 0.7638115328660888
Testing Time: 0.0022832525187525257 (+/-) 0.0005328204348280083
Training Time: 2.767561542576757 (+/-) 0.08062269161081284
=== Average network evolution ===
Total hidden node: 27.0 (+/-) 1.632993161855452
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=28, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 24 No. of nodes : 28 No. of parameters : 724
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=28, out_features=2, bias=True) )
No. of inputs : 28 No. of output : 2 No. of parameters : 58
100% (30 of 30) |########################| Elapsed Time: 0:01:19 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.46206896551725 (+/-) 2.3958892659207156
Precision: 0.7834171683539525
Recall: 0.8046206896551724
F1 score: 0.765676998189201
Testing Time: 0.002281567146038187 (+/-) 0.0004673891362354464
Training Time: 2.7292368247591217 (+/-) 0.03493085020252392
=== Average network evolution ===
Total hidden node: 25.933333333333334 (+/-) 1.768866554856213
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=27, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 24 No. of nodes : 27 No. of parameters : 699
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=27, out_features=2, bias=True) )
No. of inputs : 27 No. of output : 2 No. of parameters : 56
100% (30 of 30) |########################| Elapsed Time: 0:01:19 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.73103448275862 (+/-) 2.357980352682891
Precision: 0.787572980234372
Recall: 0.8073103448275862
F1 score: 0.7703556553199438
Testing Time: 0.002456730809705011 (+/-) 0.0005025091839931565
Training Time: 2.7260219228678735 (+/-) 0.029212104585768846
=== Average network evolution ===
Total hidden node: 25.0 (+/-) 1.632993161855452
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=26, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 24 No. of nodes : 26 No. of parameters : 674
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=26, out_features=2, bias=True) )
No. of inputs : 26 No. of output : 2 No. of parameters : 54
100% (30 of 30) |########################| Elapsed Time: 0:01:19 ETA: 00:00:00
=== Performance result ===
Accuracy: 80.38275862068966 (+/-) 2.5136961336010466
Precision: 0.7830245863448029
Recall: 0.8038275862068965
F1 score: 0.7626970037369322
Testing Time: 0.0024259008210280844 (+/-) 0.0004980262960594254
Training Time: 2.7389900602143387 (+/-) 0.050530964967255135
=== Average network evolution ===
Total hidden node: 38.733333333333334 (+/-) 5.597221532947296
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=24, out_features=40, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 24 No. of nodes : 40 No. of parameters : 1024
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=40, out_features=2, bias=True) )
No. of inputs : 40 No. of output : 2 No. of parameters : 82
========== Performance ==========
Preq Accuracy: 80.3 (+/-) 0.3
F1 score: 0.77 (+/-) 0.0
Precision: 0.78 (+/-) 0.01
Recall: 0.8 (+/-) 0.0
Training time: 2.74 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 32.6 (+/-) 6.92
%run DEVDAN_rmnist-ablation.ipynb
Number of input: 784 Number of output: 10 Number of batch: 69

Without Generative Phase
100% (69 of 69) |########################| Elapsed Time: 0:02:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.74558823529412 (+/-) 3.3949977832241487
Precision: 0.9074729874835634
Recall: 0.9074558823529412
F1 score: 0.9073866868023509
Testing Time: 0.016073510927312514 (+/-) 0.002530832043730629
Training Time: 2.318729702164145 (+/-) 0.05287749730356528
=== Average network evolution ===
Total hidden node: 28.318840579710145 (+/-) 3.2814487504944223
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=33, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 33 No. of parameters : 26689
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=33, out_features=10, bias=True) )
No. of inputs : 33 No. of output : 10 No. of parameters : 340
100% (69 of 69) |########################| Elapsed Time: 0:02:37 ETA: 00:00:00
=== Performance result ===
Accuracy: 89.73823529411766 (+/-) 3.4354565332274984
Precision: 0.8973134271130122
Recall: 0.8973823529411765
F1 score: 0.8972720965098949
Testing Time: 0.015724255758173326 (+/-) 0.0020481363358425563
Training Time: 2.2998526446959553 (+/-) 0.048863628195707616
=== Average network evolution ===
Total hidden node: 24.202898550724637 (+/-) 1.1109430664774957
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=25, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 25 No. of parameters : 20409
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=25, out_features=10, bias=True) )
No. of inputs : 25 No. of output : 10 No. of parameters : 260
100% (69 of 69) |########################| Elapsed Time: 0:02:38 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.17794117647061 (+/-) 3.9301940375775835
Precision: 0.9020326225337904
Recall: 0.9017794117647059
F1 score: 0.9018472993806194
Testing Time: 0.015858702799853158 (+/-) 0.00249740114683838
Training Time: 2.3134974065948937 (+/-) 0.09613874162513862
=== Average network evolution ===
Total hidden node: 29.594202898550726 (+/-) 1.9950579562226385
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=32, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 32 No. of parameters : 25904
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=32, out_features=10, bias=True) )
No. of inputs : 32 No. of output : 10 No. of parameters : 330
100% (69 of 69) |########################| Elapsed Time: 0:02:36 ETA: 00:00:00
=== Performance result ===
Accuracy: 89.70882352941176 (+/-) 3.85943211664304
Precision: 0.8969442578461202
Recall: 0.8970882352941176
F1 score: 0.8969661744703338
Testing Time: 0.015766459352829876 (+/-) 0.002882095631473873
Training Time: 2.2884698229677536 (+/-) 0.05335394177864605
=== Average network evolution ===
Total hidden node: 18.942028985507246 (+/-) 0.3762537677028163
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=19, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 19 No. of parameters : 15699
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=19, out_features=10, bias=True) )
No. of inputs : 19 No. of output : 10 No. of parameters : 200
100% (69 of 69) |########################| Elapsed Time: 0:02:36 ETA: 00:00:00
=== Performance result ===
Accuracy: 89.67647058823529 (+/-) 3.7582895689966556
Precision: 0.896526854746833
Recall: 0.8967647058823529
F1 score: 0.8965610633230563
Testing Time: 0.01537406795165118 (+/-) 0.0028037546741805875
Training Time: 2.2901252963963676 (+/-) 0.0797856933058875
=== Average network evolution ===
Total hidden node: 20.855072463768117 (+/-) 0.6432562614832505
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=21, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 21 No. of parameters : 17269
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=21, out_features=10, bias=True) )
No. of inputs : 21 No. of output : 10 No. of parameters : 220
========== Performance ==========
Preq Accuracy: 90.01 (+/-) 0.41
F1 score: 0.9 (+/-) 0.0
Precision: 0.9 (+/-) 0.0
Recall: 0.9 (+/-) 0.0
Training time: 2.3 (+/-) 0.01
Testing time: 0.02 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 26.0 (+/-) 5.66

Without Node Growing
100% (69 of 69) |########################| Elapsed Time: 0:03:24 ETA: 00:00:00
=== Performance result ===
Accuracy: 85.11911764705881 (+/-) 4.245421663840499
Precision: 0.8510873879611457
Recall: 0.8511911764705883
F1 score: 0.8508511516471564
Testing Time: 0.015504121780395508 (+/-) 0.0021028568446617053
Training Time: 2.985935453106375 (+/-) 0.035079603371983584
=== Average network evolution ===
Total hidden node: 10.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=10, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 10 No. of parameters : 8634
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=10, out_features=10, bias=True) )
No. of inputs : 10 No. of output : 10 No. of parameters : 110
100% (69 of 69) |########################| Elapsed Time: 0:03:24 ETA: 00:00:00
=== Performance result ===
Accuracy: 86.83676470588233 (+/-) 3.6540028176912376
Precision: 0.868112661877498
Recall: 0.8683676470588235
F1 score: 0.8680880114344237
Testing Time: 0.015471297151902142 (+/-) 0.0022369018031844316
Training Time: 2.9970511934336495 (+/-) 0.04090249126190453
=== Average network evolution ===
Total hidden node: 10.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=10, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 10 No. of parameters : 8634
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=10, out_features=10, bias=True) )
No. of inputs : 10 No. of output : 10 No. of parameters : 110
100% (69 of 69) |########################| Elapsed Time: 0:03:24 ETA: 00:00:00
=== Performance result ===
Accuracy: 86.65882352941175 (+/-) 3.5627251340263166
Precision: 0.8661992415208806
Recall: 0.8665882352941177
F1 score: 0.8662011724239246
Testing Time: 0.015501029351178338 (+/-) 0.002537465877940737
Training Time: 2.983953461927526 (+/-) 0.039112842625619024
=== Average network evolution ===
Total hidden node: 10.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=10, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 10 No. of parameters : 8634
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=10, out_features=10, bias=True) )
No. of inputs : 10 No. of output : 10 No. of parameters : 110
100% (69 of 69) |########################| Elapsed Time: 0:03:23 ETA: 00:00:00
=== Performance result ===
Accuracy: 87.05588235294118 (+/-) 3.6740097646665353
Precision: 0.8700764624304511
Recall: 0.8705588235294117
F1 score: 0.8701850518274112
Testing Time: 0.01537177843206069 (+/-) 0.002733976772446847
Training Time: 2.980056222747354 (+/-) 0.026054172130583173
=== Average network evolution ===
Total hidden node: 10.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=10, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 10 No. of parameters : 8634
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=10, out_features=10, bias=True) )
No. of inputs : 10 No. of output : 10 No. of parameters : 110
100% (69 of 69) |########################| Elapsed Time: 0:03:23 ETA: 00:00:00
=== Performance result ===
Accuracy: 86.95147058823531 (+/-) 3.8222449765680446
Precision: 0.8694087001872666
Recall: 0.8695147058823529
F1 score: 0.8690925095868817
Testing Time: 0.015403796644771801 (+/-) 0.002844179121641937
Training Time: 2.972826775382547 (+/-) 0.028841240482971073
=== Average network evolution ===
Total hidden node: 10.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=10, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 10 No. of parameters : 8634
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=10, out_features=10, bias=True) )
No. of inputs : 10 No. of output : 10 No. of parameters : 110
========== Performance ==========
Preq Accuracy: 86.52 (+/-) 0.71
F1 score: 0.86 (+/-) 0.01
Precision: 0.86 (+/-) 0.01
Recall: 0.87 (+/-) 0.01
Training time: 2.98 (+/-) 0.01
Testing time: 0.02 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 10.0 (+/-) 0.0

Without Node Pruning
100% (69 of 69) |########################| Elapsed Time: 0:04:51 ETA: 00:00:00
=== Performance result ===
Accuracy: 90.94411764705883 (+/-) 4.56728720979395
Precision: 0.9094665761623717
Recall: 0.9094411764705882
F1 score: 0.9094234762693522
Testing Time: 0.01770170997170841 (+/-) 0.0029293736884793394
Training Time: 4.260714509907891 (+/-) 0.0853115246109465
=== Average network evolution ===
Total hidden node: 65.6376811594203 (+/-) 3.1206233792549014
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=68, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 68 No. of parameters : 54164
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=68, out_features=10, bias=True) )
No. of inputs : 68 No. of output : 10 No. of parameters : 690
100% (69 of 69) |########################| Elapsed Time: 0:04:20 ETA: 00:00:00
=== Performance result ===
Accuracy: 89.63970588235291 (+/-) 3.6845520002988947
Precision: 0.896492834715607
Recall: 0.8963970588235294
F1 score: 0.8962698796242975
Testing Time: 0.017059203456429875 (+/-) 0.0031026038268762326
Training Time: 3.8178149742238663 (+/-) 0.4567385740052197
=== Average network evolution ===
Total hidden node: 38.26086956521739 (+/-) 8.997023942591598
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=44, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 44 No. of parameters : 35324
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=44, out_features=10, bias=True) )
No. of inputs : 44 No. of output : 10 No. of parameters : 450
100% (69 of 69) |########################| Elapsed Time: 0:04:50 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.00441176470588 (+/-) 4.4412317390406
Precision: 0.9102373840540025
Recall: 0.9100441176470588
F1 score: 0.9100179467337116
Testing Time: 0.016983239089741427 (+/-) 0.002016694740849741
Training Time: 4.245553668807535 (+/-) 0.0741036886837145
=== Average network evolution ===
Total hidden node: 66.89855072463769 (+/-) 1.252764219418801
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=70, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 70 No. of parameters : 55734
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=70, out_features=10, bias=True) )
No. of inputs : 70 No. of output : 10 No. of parameters : 710
100% (69 of 69) |########################| Elapsed Time: 0:04:49 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.63823529411764 (+/-) 3.9009059811854545
Precision: 0.9164836997315243
Recall: 0.9163823529411764
F1 score: 0.9163737730320589
Testing Time: 0.017834253170911002 (+/-) 0.002975104722708604
Training Time: 4.230127408223994 (+/-) 0.07497546593004517
=== Average network evolution ===
Total hidden node: 62.08695652173913 (+/-) 0.8118529608209226
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=64, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 64 No. of parameters : 51024
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=64, out_features=10, bias=True) )
No. of inputs : 64 No. of output : 10 No. of parameters : 650
100% (69 of 69) |########################| Elapsed Time: 0:04:47 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.88823529411764 (+/-) 3.9337352931280702
Precision: 0.9192029032083063
Recall: 0.9188823529411765
F1 score: 0.9189213501103894
Testing Time: 0.018216006896075082 (+/-) 0.004014107098851965
Training Time: 4.204413683975444 (+/-) 0.07302759036753052
=== Average network evolution ===
Total hidden node: 57.65217391304348 (+/-) 1.5306159313082242
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=784, out_features=58, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 784 No. of nodes : 58 No. of parameters : 46314
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=58, out_features=10, bias=True) )
No. of inputs : 58 No. of output : 10 No. of parameters : 590
========== Performance ==========
Preq Accuracy: 91.02 (+/-) 0.78
F1 score: 0.91 (+/-) 0.01
Precision: 0.91 (+/-) 0.01
Recall: 0.91 (+/-) 0.01
Training time: 4.15 (+/-) 0.17
Testing time: 0.02 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 60.8 (+/-) 9.35
%run DEVDAN_sea-ablation.ipynb
Number of input: 3 Number of output: 2 Number of batch: 100

Without Generative Phase
100% (100 of 100) |######################| Elapsed Time: 0:03:21 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.27878787878788 (+/-) 6.08869249581057
Precision: 0.9227983320267883
Recall: 0.9227878787878788
F1 score: 0.9222126425947392
Testing Time: 0.001395562682488952 (+/-) 0.0004930580679292113
Training Time: 2.028026397782143 (+/-) 0.026065904831567614
=== Average network evolution ===
Total hidden node: 22.48 (+/-) 11.036738648713214
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=3, out_features=41, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 3 No. of nodes : 41 No. of parameters : 167
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=41, out_features=2, bias=True) )
No. of inputs : 41 No. of output : 2 No. of parameters : 84
100% (100 of 100) |######################| Elapsed Time: 0:03:20 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.26060606060607 (+/-) 5.909704707825987
Precision: 0.9226098862598247
Recall: 0.9226060606060607
F1 score: 0.9220315998095829
Testing Time: 0.0014971843873611604 (+/-) 0.0005213897086913434
Training Time: 2.0237452454037137 (+/-) 0.03478457141079413
=== Average network evolution ===
Total hidden node: 24.03 (+/-) 10.923785973736393
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=3, out_features=43, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 3 No. of nodes : 43 No. of parameters : 175
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=43, out_features=2, bias=True) )
No. of inputs : 43 No. of output : 2 No. of parameters : 88
100% (100 of 100) |######################| Elapsed Time: 0:03:20 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.7181818181818 (+/-) 7.115417371523792
Precision: 0.9173894371460026
Recall: 0.9171818181818182
F1 score: 0.9164032247390432
Testing Time: 0.0013841860222093987 (+/-) 0.0004912996344615757
Training Time: 2.0270115823456734 (+/-) 0.03971917899283722
=== Average network evolution ===
Total hidden node: 15.64 (+/-) 8.498846980620371
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=3, out_features=30, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 3 No. of nodes : 30 No. of parameters : 123
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=30, out_features=2, bias=True) )
No. of inputs : 30 No. of output : 2 No. of parameters : 62
100% (100 of 100) |######################| Elapsed Time: 0:03:21 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.1222222222222 (+/-) 5.9849568246597675
Precision: 0.9209550208931403
Recall: 0.9212222222222223
F1 score: 0.920894743656429
Testing Time: 0.0014550974874785452 (+/-) 0.0004998964343142926
Training Time: 2.0351246053522285 (+/-) 0.043320118009483
=== Average network evolution ===
Total hidden node: 21.01 (+/-) 11.611627792863498
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=3, out_features=39, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 3 No. of nodes : 39 No. of parameters : 159
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=39, out_features=2, bias=True) )
No. of inputs : 39 No. of output : 2 No. of parameters : 80
100% (100 of 100) |######################| Elapsed Time: 0:03:24 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.82727272727273 (+/-) 6.739987983411315
Precision: 0.9180374596979158
Recall: 0.9182727272727272
F1 score: 0.9181062647800619
Testing Time: 0.0015477170847883128 (+/-) 0.0004973741247772138
Training Time: 2.064640310075548 (+/-) 0.09904460273522589
=== Average network evolution ===
Total hidden node: 22.32 (+/-) 14.969889779153352
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=3, out_features=47, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 3 No. of nodes : 47 No. of parameters : 191
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=47, out_features=2, bias=True) )
No. of inputs : 47 No. of output : 2 No. of parameters : 96
========== Performance ==========
Preq Accuracy: 92.04 (+/-) 0.23
F1 score: 0.92 (+/-) 0.0
Precision: 0.92 (+/-) 0.0
Recall: 0.92 (+/-) 0.0
Training time: 2.04 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 40.0 (+/-) 5.66

Without Node Growing
100% (100 of 100) |######################| Elapsed Time: 0:04:26 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.81313131313131 (+/-) 6.431419820066856
Precision: 0.9178442013534059
Recall: 0.9181313131313131
F1 score: 0.9177606499358827
Testing Time: 0.0014143929337010238 (+/-) 0.0004980447948747212
Training Time: 2.691780292626583 (+/-) 0.028998377744018197
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=3, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 3 No. of nodes : 2 No. of parameters : 11
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) )
No. of inputs : 2 No. of output : 2 No. of parameters : 6
100% (100 of 100) |######################| Elapsed Time: 0:04:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 62.74747474747475 (+/-) 7.867980673954783
Precision: 0.5661861988636364
Recall: 0.6274747474747475
F1 score: 0.508939710278005
Testing Time: 0.0014054582576559047 (+/-) 0.0005008244700854948
Training Time: 2.7096815277831725 (+/-) 0.05452799572566072
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=3, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 3 No. of nodes : 2 No. of parameters : 11
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) )
No. of inputs : 2 No. of output : 2 No. of parameters : 6
100% (100 of 100) |######################| Elapsed Time: 0:04:29 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.67171717171718 (+/-) 7.184200403326711
Precision: 0.9166007049466466
Recall: 0.9167171717171717
F1 score: 0.9161164302685715
Testing Time: 0.0013790130615234375 (+/-) 0.0004890834933230784
Training Time: 2.720866499525128 (+/-) 0.059100067244273406
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=3, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 3 No. of nodes : 2 No. of parameters : 11
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) )
No. of inputs : 2 No. of output : 2 No. of parameters : 6
100% (100 of 100) |######################| Elapsed Time: 0:04:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.49393939393939 (+/-) 6.891743239936887
Precision: 0.9146924653814028
Recall: 0.9149393939393939
F1 score: 0.9147686950269358
Testing Time: 0.0012775551189075816 (+/-) 0.0004573904328600576
Training Time: 2.710835081158262 (+/-) 0.03225102146056753
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=3, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 3 No. of nodes : 2 No. of parameters : 11
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) )
No. of inputs : 2 No. of output : 2 No. of parameters : 6
100% (100 of 100) |######################| Elapsed Time: 0:04:27 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.05454545454543 (+/-) 6.373893609316316
Precision: 0.9202795774102572
Recall: 0.9205454545454546
F1 score: 0.9203167576495773
Testing Time: 0.0013533794518673058 (+/-) 0.0004867193479383472
Training Time: 2.7031733845219468 (+/-) 0.03619830647806712
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet( (linear): Linear(in_features=3, out_features=2, bias=True) (activation): Sigmoid() (activationh): ReLU(inplace=True) )
No. of inputs : 3 No. of nodes : 2 No. of parameters : 11
2 -th layer
outputLayerBasicNet( (linearOutput): Linear(in_features=2, out_features=2, bias=True) )
No. of inputs : 2 No. of output : 2 No. of parameters : 6
========== Performance ==========
Preq Accuracy: 85.96 (+/-) 11.61
F1 score: 0.84 (+/-) 0.16
Precision: 0.85 (+/-) 0.14
Recall: 0.86 (+/-) 0.12
Training time: 2.71 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 2.0 (+/-) 0.0

Without Node Pruning
100% (100 of 100) |######################| Elapsed Time: 0:04:29 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.26666666666667 (+/-) 5.848093879222701
Precision: 0.9224183782829053
Recall: 0.9226666666666666
F1 score: 0.9223276858285956
Testing Time: 0.0016037695335619378 (+/-) 0.000523361557737617
Training Time: 2.7228738707725446 (+/-) 0.05765343349300195
=== Average network evolution ===
Total hidden node: 30.36 (+/-) 13.565780478837183
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=52, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 52
No. of parameters : 211
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=52, out_features=2, bias=True)
)
No. of inputs : 52
No. of output : 2
No. of parameters : 106
100% (100 of 100) |######################| Elapsed Time: 0:04:27 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.01919191919191 (+/-) 6.170763205546344
Precision: 0.920056916978132
Recall: 0.9201919191919192
F1 score: 0.9196699131790155
Testing Time: 0.0014270724672259707 (+/-) 0.0004975809934545357
Training Time: 2.699920324364094 (+/-) 0.040659852940174
=== Average network evolution ===
Total hidden node: 26.4 (+/-) 13.384319183283104
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=48, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 48
No. of parameters : 195
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=48, out_features=2, bias=True)
)
No. of inputs : 48
No. of output : 2
No. of parameters : 98
100% (100 of 100) |######################| Elapsed Time: 0:04:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.0949494949495 (+/-) 6.306888549773765
Precision: 0.9206804588201766
Recall: 0.920949494949495
F1 score: 0.9206186199425602
Testing Time: 0.0015178280647354897 (+/-) 0.0004982504906615172
Training Time: 2.710491556109804 (+/-) 0.05266129989289533
=== Average network evolution ===
Total hidden node: 28.96 (+/-) 13.352093468816042
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=50, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 50
No. of parameters : 203
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=50, out_features=2, bias=True)
)
No. of inputs : 50
No. of output : 2
No. of parameters : 102
100% (100 of 100) |######################| Elapsed Time: 0:04:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 91.93030303030302 (+/-) 6.070286208709386
Precision: 0.9194156050227037
Recall: 0.9193030303030303
F1 score: 0.9186110102668115
Testing Time: 0.0016246540377838443 (+/-) 0.00047985255511146164
Training Time: 2.704034027427134 (+/-) 0.03540775645141918
=== Average network evolution ===
Total hidden node: 29.66 (+/-) 9.55951881634217
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=45, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 45
No. of parameters : 183
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=45, out_features=2, bias=True)
)
No. of inputs : 45
No. of output : 2
No. of parameters : 92
100% (100 of 100) |######################| Elapsed Time: 0:04:27 ETA: 00:00:00
=== Performance result ===
Accuracy: 92.12121212121212 (+/-) 6.0422965853813935
Precision: 0.921024633425642
Recall: 0.9212121212121213
F1 score: 0.9207560055110752
Testing Time: 0.0014768056195191663 (+/-) 0.0005007150460000052
Training Time: 2.702960288885868 (+/-) 0.029190661705202983
=== Average network evolution ===
Total hidden node: 28.15 (+/-) 9.982359440533084
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=3, out_features=45, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 3
No. of nodes : 45
No. of parameters : 183
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=45, out_features=2, bias=True)
)
No. of inputs : 45
No. of output : 2
No. of parameters : 92
========== Performance ==========
Preq Accuracy: 92.09 (+/-) 0.11
F1 score: 0.92 (+/-) 0.0
Precision: 0.92 (+/-) 0.0
Recall: 0.92 (+/-) 0.0
Training time: 2.71 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 48.0 (+/-) 2.76
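The "Preq Accuracy" figures reported in these summaries are prequential (test-then-train) estimates: each incoming batch is used for testing before the model trains on it, so every prediction is made on data the model has not yet seen. A minimal sketch of that evaluation loop, using scikit-learn's SGDClassifier on synthetic data as a stand-in for the DEVDAN network (the model, data, and batch count here are illustrative assumptions, not taken from the notebook):

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Synthetic binary stream standing in for a real data stream.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

model = SGDClassifier(random_state=0)
n_batch = 10
accs = []
first = True
for Xb, yb in zip(np.array_split(X, n_batch), np.array_split(y, n_batch)):
    if not first:
        # Test first: score the batch before the model has trained on it.
        accs.append(model.score(Xb, yb))
    # Then train on the same batch.
    model.partial_fit(Xb, yb, classes=[0, 1])
    first = False

preq_acc = 100 * float(np.mean(accs))
print(f"Prequential accuracy: {preq_acc:.2f}")
```

The first batch only trains (there is no model to test yet), so a stream of `n_batch` batches yields `n_batch - 1` test scores.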
%run DEVDAN_weather-ablation.ipynb
Number of input: 8
Number of output: 2
Number of batch: 18

Without Generative Phase
100% (18 of 18) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 73.32352941176471 (+/-) 3.6269464407996237
Precision: 0.717146586666287
Recall: 0.7332352941176471
F1 score: 0.7154387322390723
Testing Time: 0.0018020938424503103 (+/-) 0.00037335248921221173
Training Time: 2.02371389725629 (+/-) 0.021460935868924527
=== Average network evolution ===
Total hidden node: 7.0 (+/-) 0.8819171036881969
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=10, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 10
No. of parameters : 98
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=10, out_features=2, bias=True)
)
No. of inputs : 10
No. of output : 2
No. of parameters : 22
100% (18 of 18) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 72.30000000000001 (+/-) 4.568691407445963
Precision: 0.7061116336041033
Recall: 0.723
F1 score: 0.70720626951906
Testing Time: 0.0017394879285027 (+/-) 0.00043316443577439675
Training Time: 2.033839506261489 (+/-) 0.04669395999362822
=== Average network evolution ===
Total hidden node: 10.555555555555555 (+/-) 1.0122703976826999
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=13, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 13
No. of parameters : 125
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=13, out_features=2, bias=True)
)
No. of inputs : 13
No. of output : 2
No. of parameters : 28
100% (18 of 18) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 73.12941176470588 (+/-) 4.088544211773464
Precision: 0.7324851113482681
Recall: 0.7312941176470589
F1 score: 0.7318705246657892
Testing Time: 0.0015668869018554688 (+/-) 0.0004894101802008571
Training Time: 2.0170879504259895 (+/-) 0.016408703337512886
=== Average network evolution ===
Total hidden node: 10.166666666666666 (+/-) 0.8333333333333334
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=12, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 12
No. of parameters : 116
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=12, out_features=2, bias=True)
)
No. of inputs : 12
No. of output : 2
No. of parameters : 26
100% (18 of 18) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 72.50588235294117 (+/-) 3.6658151470091216
Precision: 0.7185570805630662
Recall: 0.7250588235294118
F1 score: 0.7211381018695603
Testing Time: 0.001514448839075425 (+/-) 0.0004991862972807128
Training Time: 2.0181839325848747 (+/-) 0.015580035673095597
=== Average network evolution ===
Total hidden node: 7.555555555555555 (+/-) 0.7617394000445604
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=9, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 9
No. of parameters : 89
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=9, out_features=2, bias=True)
)
No. of inputs : 9
No. of output : 2
No. of parameters : 20
100% (18 of 18) |########################| Elapsed Time: 0:00:34 ETA: 00:00:00
=== Performance result ===
Accuracy: 74.7 (+/-) 3.1546417714032704
Precision: 0.7360842438750544
Recall: 0.747
F1 score: 0.7379007733762636
Testing Time: 0.0016206713283763213 (+/-) 0.00047401652175644394
Training Time: 2.0262389183044434 (+/-) 0.030630162510188352
=== Average network evolution ===
Total hidden node: 8.38888888888889 (+/-) 1.1613636089092707
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=11, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 11
No. of parameters : 107
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=11, out_features=2, bias=True)
)
No. of inputs : 11
No. of output : 2
No. of parameters : 24
========== Performance ==========
Preq Accuracy: 73.19 (+/-) 0.84
F1 score: 0.72 (+/-) 0.01
Precision: 0.72 (+/-) 0.01
Recall: 0.73 (+/-) 0.01
Training time: 2.02 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 11.0 (+/-) 1.41

Without Node Growing
100% (18 of 18) |########################| Elapsed Time: 0:00:46 ETA: 00:00:00
C:\Users\SCSE\AppData\Local\Continuum\miniconda3\envs\stmicro\lib\site-packages\sklearn\metrics\_classification.py:1221: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
  _warn_prf(average, modifier, msg_start, len(result))
=== Performance result ===
Accuracy: 68.60000000000001 (+/-) 4.105806505282918
Precision: 0.470596
Recall: 0.686
F1 score: 0.5582396204033215
Testing Time: 0.0011524032143985525 (+/-) 0.00038346635940878177
Training Time: 2.7305170367745792 (+/-) 0.05466876775781112
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=2, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 2
No. of parameters : 26
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=2, out_features=2, bias=True)
)
No. of inputs : 2
No. of output : 2
No. of parameters : 6
100% (18 of 18) |########################| Elapsed Time: 0:00:46 ETA: 00:00:00
=== Performance result ===
Accuracy: 72.31176470588235 (+/-) 3.5308526470563613
Precision: 0.7052876317906593
Recall: 0.7231176470588235
F1 score: 0.705165319208252
Testing Time: 0.001453722224516027 (+/-) 0.0004898616459168563
Training Time: 2.7118058064404655 (+/-) 0.03708563594086276
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=2, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 2
No. of parameters : 26
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=2, out_features=2, bias=True)
)
No. of inputs : 2
No. of output : 2
No. of parameters : 6
100% (18 of 18) |########################| Elapsed Time: 0:00:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 72.49411764705883 (+/-) 3.7126921559675123
Precision: 0.7079655553845364
Recall: 0.7249411764705882
F1 score: 0.7083938635311748
Testing Time: 0.0016226488001206342 (+/-) 0.0004860521421664403
Training Time: 2.697818770128138 (+/-) 0.014741974455908125
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=2, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 2
No. of parameters : 26
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=2, out_features=2, bias=True)
)
No. of inputs : 2
No. of output : 2
No. of parameters : 6
100% (18 of 18) |########################| Elapsed Time: 0:00:45 ETA: 00:00:00
C:\Users\SCSE\AppData\Local\Continuum\miniconda3\envs\stmicro\lib\site-packages\sklearn\metrics\_classification.py:1221: UndefinedMetricWarning: Precision is ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.
  _warn_prf(average, modifier, msg_start, len(result))
=== Performance result ===
Accuracy: 68.60000000000001 (+/-) 4.105806505282918
Precision: 0.470596
Recall: 0.686
F1 score: 0.5582396204033215
Testing Time: 0.00157468459185432 (+/-) 0.0004927724404940576
Training Time: 2.694782397326301 (+/-) 0.012845652896684082
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=2, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 2
No. of parameters : 26
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=2, out_features=2, bias=True)
)
No. of inputs : 2
No. of output : 2
No. of parameters : 6
100% (18 of 18) |########################| Elapsed Time: 0:00:46 ETA: 00:00:00
=== Performance result ===
Accuracy: 74.19411764705882 (+/-) 3.090872823279629
Precision: 0.7288286982528666
Recall: 0.7419411764705882
F1 score: 0.7294950874873668
Testing Time: 0.001450945349300609 (+/-) 0.0004983682412269472
Training Time: 2.7258081436157227 (+/-) 0.06308574827617441
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=2, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 2
No. of parameters : 26
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=2, out_features=2, bias=True)
)
No. of inputs : 2
No. of output : 2
No. of parameters : 6
========== Performance ==========
Preq Accuracy: 71.24 (+/-) 2.25
F1 score: 0.65 (+/-) 0.08
Precision: 0.62 (+/-) 0.12
Recall: 0.71 (+/-) 0.02
Training time: 2.71 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 2.0 (+/-) 0.0

Without Node Pruning
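The UndefinedMetricWarning raised in two of the Without Node Growing runs fires when some class receives no predicted samples, making its precision 0/0; scikit-learn substitutes 0.0 by default, which is what drags the averaged precision down to 0.470596 in those runs (consistent with a degenerate classifier that predicts only the majority class: 0.686 × 0.686 ≈ 0.4706 under weighted averaging). The `zero_division` parameter of `precision_score` controls that substitution. A standalone illustration, separate from the notebook's own metric code:

```python
from sklearn.metrics import precision_score

y_true = [0, 0, 1, 1]
y_pred = [0, 0, 0, 0]  # class 1 is never predicted -> its precision is 0/0

# Default behavior: the undefined per-class precision is set to 0.0
# (passing zero_division=0 explicitly also suppresses the warning).
p_default = precision_score(y_true, y_pred, average="weighted", zero_division=0)

# Alternative: treat the undefined per-class precision as 1.0 instead.
p_one = precision_score(y_true, y_pred, average="weighted", zero_division=1)

print(p_default, p_one)
```

Here class 0 has precision 2/4 = 0.5 with support 2, and class 1 contributes either 0.0 or 1.0 with support 2, so the weighted averages are 0.25 and 0.75 respectively.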
100% (18 of 18) |########################| Elapsed Time: 0:00:46 ETA: 00:00:00
=== Performance result ===
Accuracy: 74.85882352941178 (+/-) 3.664946644909663
Precision: 0.7409542655066169
Recall: 0.7485882352941177
F1 score: 0.7433799268800994
Testing Time: 0.001797044978422277 (+/-) 0.0005068418728250268
Training Time: 2.733143820482142 (+/-) 0.060492400574404546
=== Average network evolution ===
Total hidden node: 18.555555555555557 (+/-) 2.3856567281759875
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=21, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 21
No. of parameters : 197
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=21, out_features=2, bias=True)
)
No. of inputs : 21
No. of output : 2
No. of parameters : 44
100% (18 of 18) |########################| Elapsed Time: 0:00:46 ETA: 00:00:00
=== Performance result ===
Accuracy: 73.34705882352942 (+/-) 3.697871827871178
Precision: 0.7229257587608933
Recall: 0.7334705882352941
F1 score: 0.7257959466108125
Testing Time: 0.001614528543808881 (+/-) 0.0004771757342794189
Training Time: 2.7041431735543644 (+/-) 0.04836136542327823
=== Average network evolution ===
Total hidden node: 16.833333333333332 (+/-) 1.8027756377319946
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=20, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 20
No. of parameters : 188
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=20, out_features=2, bias=True)
)
No. of inputs : 20
No. of output : 2
No. of parameters : 42
100% (18 of 18) |########################| Elapsed Time: 0:00:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 72.90588235294118 (+/-) 5.019194989149192
Precision: 0.7258622892084874
Recall: 0.7290588235294118
F1 score: 0.7273079509111081
Testing Time: 0.0013945803922765395 (+/-) 0.0004953211438070998
Training Time: 2.698570630129646 (+/-) 0.031339594367953194
=== Average network evolution ===
Total hidden node: 9.333333333333334 (+/-) 2.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=12, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 12
No. of parameters : 116
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=12, out_features=2, bias=True)
)
No. of inputs : 12
No. of output : 2
No. of parameters : 26
100% (18 of 18) |########################| Elapsed Time: 0:00:45 ETA: 00:00:00
=== Performance result ===
Accuracy: 74.2529411764706 (+/-) 2.7527777729282876
Precision: 0.7295687178580386
Recall: 0.7425294117647059
F1 score: 0.73028971472955
Testing Time: 0.0017963577719295725 (+/-) 0.0003798953793357341
Training Time: 2.6945171776939842 (+/-) 0.011669351608103885
=== Average network evolution ===
Total hidden node: 14.0 (+/-) 0.9428090415820634
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=16, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 16
No. of parameters : 152
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=16, out_features=2, bias=True)
)
No. of inputs : 16
No. of output : 2
No. of parameters : 34
100% (18 of 18) |########################| Elapsed Time: 0:00:46 ETA: 00:00:00
=== Performance result ===
Accuracy: 73.96470588235294 (+/-) 2.4996331910833223
Precision: 0.7265793328768682
Recall: 0.7396470588235294
F1 score: 0.7277933837706771
Testing Time: 0.0017454203437356388 (+/-) 0.0004240404233773277
Training Time: 2.707971418605131 (+/-) 0.04007195905055027
=== Average network evolution ===
Total hidden node: 13.166666666666666 (+/-) 2.034425935955617
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=16, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 16
No. of parameters : 152
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=16, out_features=2, bias=True)
)
No. of inputs : 16
No. of output : 2
No. of parameters : 34
========== Performance ==========
Preq Accuracy: 73.87 (+/-) 0.68
F1 score: 0.73 (+/-) 0.01
Precision: 0.73 (+/-) 0.01
Recall: 0.74 (+/-) 0.01
Training time: 2.71 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 17.0 (+/-) 3.22
%run DEVDAN_electricitypricing-ablation.ipynb
Number of input: 8
Number of output: 2
Number of batch: 45

Without Generative Phase
100% (45 of 45) |########################| Elapsed Time: 0:01:29 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.16363636363636 (+/-) 6.779551168741687
Precision: 0.6796222056044412
Recall: 0.6816363636363636
F1 score: 0.6802467331972326
Testing Time: 0.0016081116416237571 (+/-) 0.00048428904177932403
Training Time: 2.032021262428977 (+/-) 0.037695231628309185
=== Average network evolution ===
Total hidden node: 11.555555555555555 (+/-) 1.9726525352153024
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=13, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 13
No. of parameters : 125
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=13, out_features=2, bias=True)
)
No. of inputs : 13
No. of output : 2
No. of parameters : 28
100% (45 of 45) |########################| Elapsed Time: 0:01:29 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.55909090909091 (+/-) 7.071560139737013
Precision: 0.6830235440955863
Recall: 0.6855909090909091
F1 score: 0.6835298182858075
Testing Time: 0.0017789927395907316 (+/-) 0.00045360356112352434
Training Time: 2.0232455188577827 (+/-) 0.022242758710080825
=== Average network evolution ===
Total hidden node: 5.866666666666666 (+/-) 0.8055363982396381
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=7, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 7
No. of parameters : 71
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=7, out_features=2, bias=True)
)
No. of inputs : 7
No. of output : 2
No. of parameters : 16
100% (45 of 45) |########################| Elapsed Time: 0:01:29 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.08409090909089 (+/-) 7.255591673818896
Precision: 0.677266842340992
Recall: 0.6808409090909091
F1 score: 0.6767609054950063
Testing Time: 0.001635101708498868 (+/-) 0.00047824708938723964
Training Time: 2.0259444551034407 (+/-) 0.03580335731647419
=== Average network evolution ===
Total hidden node: 11.977777777777778 (+/-) 1.2380789579719216
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=13, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 13
No. of parameters : 125
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=13, out_features=2, bias=True)
)
No. of inputs : 13
No. of output : 2
No. of parameters : 28
100% (45 of 45) |########################| Elapsed Time: 0:01:28 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.48409090909091 (+/-) 7.7779683951126914
Precision: 0.6812923476078304
Recall: 0.6848409090909091
F1 score: 0.6796748413792804
Testing Time: 0.0016808455640619452 (+/-) 0.0004581021234020211
Training Time: 2.0156031413511797 (+/-) 0.021503036256615385
=== Average network evolution ===
Total hidden node: 9.244444444444444 (+/-) 1.0144633076011846
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=10, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 10
No. of parameters : 98
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=10, out_features=2, bias=True)
)
No. of inputs : 10
No. of output : 2
No. of parameters : 22
100% (45 of 45) |########################| Elapsed Time: 0:01:29 ETA: 00:00:00
=== Performance result ===
Accuracy: 67.92272727272729 (+/-) 6.4734374610390315
Precision: 0.6754409192883752
Recall: 0.6792272727272727
F1 score: 0.6730382504720993
Testing Time: 0.0014806281436573374 (+/-) 0.0005044863602579456
Training Time: 2.0257754000750454 (+/-) 0.03340499801956638
=== Average network evolution ===
Total hidden node: 8.155555555555555 (+/-) 2.0865234834320887
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=10, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 10
No. of parameters : 98
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=10, out_features=2, bias=True)
)
No. of inputs : 10
No. of output : 2
No. of parameters : 22
========== Performance ==========
Preq Accuracy: 68.24 (+/-) 0.24
F1 score: 0.68 (+/-) 0.0
Precision: 0.68 (+/-) 0.0
Recall: 0.68 (+/-) 0.0
Training time: 2.02 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 10.6 (+/-) 2.24

Without Node Growing
100% (45 of 45) |########################| Elapsed Time: 0:02:01 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.4659090909091 (+/-) 6.573360258090577
Precision: 0.6818972331440835
Recall: 0.6846590909090909
F1 score: 0.6823283726238462
Testing Time: 0.0015494660897688432 (+/-) 0.0004913935251121107
Training Time: 2.753814561800523 (+/-) 0.11748496481473741
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=2, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 2
No. of parameters : 26
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=2, out_features=2, bias=True)
)
No. of inputs : 2
No. of output : 2
No. of parameters : 6
100% (45 of 45) |########################| Elapsed Time: 0:01:58 ETA: 00:00:00
=== Performance result ===
Accuracy: 67.61363636363637 (+/-) 5.517887890928875
Precision: 0.6754462559804574
Recall: 0.6761363636363636
F1 score: 0.6619221116730928
Testing Time: 0.0014404491944746537 (+/-) 0.0005070220445672582
Training Time: 2.6986123485998674 (+/-) 0.028780528145456872
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=2, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 2
No. of parameters : 26
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=2, out_features=2, bias=True)
)
No. of inputs : 2
No. of output : 2
No. of parameters : 6
100% (45 of 45) |########################| Elapsed Time: 0:02:00 ETA: 00:00:00
=== Performance result ===
Accuracy: 66.54318181818181 (+/-) 6.346237171715964
Precision: 0.6609678450088046
Recall: 0.6654318181818182
F1 score: 0.6570960355615786
Testing Time: 0.0015094659545204856 (+/-) 0.0005003074585927562
Training Time: 2.726659216664054 (+/-) 0.04762583667130423
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=2, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 2
No. of parameters : 26
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=2, out_features=2, bias=True)
)
No. of inputs : 2
No. of output : 2
No. of parameters : 6
100% (45 of 45) |########################| Elapsed Time: 0:01:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 59.83636363636364 (+/-) 6.767942177819935
Precision: 0.5886892969051388
Recall: 0.5983636363636363
F1 score: 0.5876577890245466
Testing Time: 0.0014139901507984507 (+/-) 0.0004965873925558917
Training Time: 2.7038654685020447 (+/-) 0.03197107613745281
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=2, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 2
No. of parameters : 26
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=2, out_features=2, bias=True)
)
No. of inputs : 2
No. of output : 2
No. of parameters : 6
100% (45 of 45) |########################| Elapsed Time: 0:01:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 65.86590909090908 (+/-) 7.413070250388167
Precision: 0.6549165834345053
Recall: 0.6586590909090909
F1 score: 0.6554079500954676
Testing Time: 0.001485830003565008 (+/-) 0.0005010854025706715
Training Time: 2.7098382169550117 (+/-) 0.036503107550605186
=== Average network evolution ===
Total hidden node: 2.0 (+/-) 0.0
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=2, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 2
No. of parameters : 26
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=2, out_features=2, bias=True)
)
No. of inputs : 2
No. of output : 2
No. of parameters : 6
========== Performance ==========
Preq Accuracy: 65.67 (+/-) 3.05
F1 score: 0.65 (+/-) 0.03
Precision: 0.65 (+/-) 0.03
Recall: 0.66 (+/-) 0.03
Training time: 2.72 (+/-) 0.02
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 2.0 (+/-) 0.0

Without Node Pruning
100% (45 of 45) |########################| Elapsed Time: 0:01:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 67.6340909090909 (+/-) 7.703686932590908
Precision: 0.6729065601128338
Recall: 0.676340909090909
F1 score: 0.6730172902716484
Testing Time: 0.0017032189802689986 (+/-) 0.0004479726902760726
Training Time: 2.7149289629676123 (+/-) 0.03997734755367686
=== Average network evolution ===
Total hidden node: 17.466666666666665 (+/-) 2.49087222563592
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=20, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 20
No. of parameters : 188
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=20, out_features=2, bias=True)
)
No. of inputs : 20
No. of output : 2
No. of parameters : 42
100% (45 of 45) |########################| Elapsed Time: 0:01:58 ETA: 00:00:00
=== Performance result ===
Accuracy: 67.27499999999999 (+/-) 7.092701914964808
Precision: 0.6687702446639067
Recall: 0.67275
F1 score: 0.6680440931347955
Testing Time: 0.0017780769955028188 (+/-) 0.0004032561564487275
Training Time: 2.6993932832371104 (+/-) 0.02849194179461089
=== Average network evolution ===
Total hidden node: 11.71111111111111 (+/-) 1.6278441397166612
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=13, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 13
No. of parameters : 125
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=13, out_features=2, bias=True)
)
No. of inputs : 13
No. of output : 2
No. of parameters : 28
100% (45 of 45) |########################| Elapsed Time: 0:01:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 69.30909090909091 (+/-) 7.0455923740184145
Precision: 0.6904108642859478
Recall: 0.6930909090909091
F1 score: 0.690698709534611
Testing Time: 0.002131386236710982 (+/-) 0.0024418435481554056
Training Time: 2.7083996805277737 (+/-) 0.03501783156360387
=== Average network evolution ===
Total hidden node: 22.08888888888889 (+/-) 2.8969119360850613
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=25, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 25
No. of parameters : 233
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=25, out_features=2, bias=True)
)
No. of inputs : 25
No. of output : 2
No. of parameters : 52
100% (45 of 45) |########################| Elapsed Time: 0:01:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.67045454545456 (+/-) 6.91613987007777
Precision: 0.6834212766380879
Recall: 0.6867045454545454
F1 score: 0.6831148238189384
Testing Time: 0.0015842210162769663 (+/-) 0.0004870221919743115
Training Time: 2.7041912620717827 (+/-) 0.02951922653926562
=== Average network evolution ===
Total hidden node: 17.22222222222222 (+/-) 1.8121673811444547
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=19, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 19
No. of parameters : 179
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=19, out_features=2, bias=True)
)
No. of inputs : 19
No. of output : 2
No. of parameters : 40
100% (45 of 45) |########################| Elapsed Time: 0:01:59 ETA: 00:00:00
=== Performance result ===
Accuracy: 68.22954545454544 (+/-) 6.998135451203301
Precision: 0.6795280633449224
Recall: 0.6822954545454546
F1 score: 0.6800028141616109
Testing Time: 0.0015651095997203481 (+/-) 0.0004924211515183373
Training Time: 2.7061337449333887 (+/-) 0.039544123731576994
=== Average network evolution ===
Total hidden node: 14.71111111111111 (+/-) 2.0069016719778907
=== Final network structure ===
1 -th layer
hiddenLayerBasicNet(
  (linear): Linear(in_features=8, out_features=16, bias=True)
  (activation): Sigmoid()
  (activationh): ReLU(inplace=True)
)
No. of inputs : 8
No. of nodes : 16
No. of parameters : 152
2 -th layer
outputLayerBasicNet(
  (linearOutput): Linear(in_features=16, out_features=2, bias=True)
)
No. of inputs : 16
No. of output : 2
No. of parameters : 34
========== Performance ==========
Preq Accuracy: 68.22 (+/-) 0.72
F1 score: 0.68 (+/-) 0.01
Precision: 0.68 (+/-) 0.01
Recall: 0.68 (+/-) 0.01
Training time: 2.71 (+/-) 0.01
Testing time: 0.0 (+/-) 0.0
========== Network ==========
Number of hidden layers: 1.0 (+/-) 0.0
Number of features: 18.6 (+/-) 4.03
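The "No. of parameters" lines in these logs can be sanity-checked by hand: a `Linear(in_features, out_features, bias=True)` layer holds `in * out` weights plus `out` biases. The output-layer counts above match this exactly (e.g. `Linear(16, 2)` gives 16 × 2 + 2 = 34). The hidden-layer counts are consistently `in_features` larger than the bare linear layer, which is consistent with the hidden layer also carrying a decoder bias of size `in_features` for DEVDAN's generative phase; that reading is inferred from the numbers, not confirmed from the code. A quick check in plain Python, with `linear_params` as a hypothetical helper:

```python
def linear_params(n_in, n_out):
    """Parameter count of a fully connected layer with bias: n_in*n_out + n_out."""
    return n_in * n_out + n_out

# Output layers reported above: (in_features, reported parameter count).
reported_output_layers = [(16, 34), (19, 40), (25, 52), (13, 28), (20, 42)]
for n_in, n_params in reported_output_layers:
    assert linear_params(n_in, 2) == n_params  # exact match

# Hidden layers: Linear(8, 16) alone has 144 parameters, while the log
# reports 152 -- an extra 8 = in_features parameters per hidden layer.
print(linear_params(8, 16), "vs reported 152")
```

The same offset appears in the sea runs (`Linear(3, 41)` has 164 parameters against a reported 167, again a difference of `in_features`).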